Mar 20 13:24:09 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 13:24:09 crc restorecon[4705]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 13:24:09 crc restorecon[4705]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc 
restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc 
restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc 
restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc 
restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc 
restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc 
restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:24:09 crc restorecon[4705]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc 
restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:09 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc 
restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:24:10 crc restorecon[4705]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc 
restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:24:10 crc restorecon[4705]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 13:24:10 crc kubenswrapper[4849]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:24:10 crc kubenswrapper[4849]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 13:24:10 crc kubenswrapper[4849]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:24:10 crc kubenswrapper[4849]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 13:24:10 crc kubenswrapper[4849]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 13:24:10 crc kubenswrapper[4849]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.836319 4849 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842329 4849 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842364 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842374 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842383 4849 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842412 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842420 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842428 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842436 4849 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842444 4849 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:24:10 crc 
kubenswrapper[4849]: W0320 13:24:10.842452 4849 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842460 4849 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842468 4849 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842475 4849 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842483 4849 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842491 4849 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842499 4849 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842507 4849 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842515 4849 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842523 4849 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842531 4849 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842538 4849 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842546 4849 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842553 4849 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842561 4849 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842569 4849 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842577 4849 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842584 4849 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842591 4849 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842600 4849 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842608 4849 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842619 4849 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842629 4849 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842639 4849 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842647 4849 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842655 4849 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842663 4849 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842672 4849 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842680 4849 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842687 4849 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842695 4849 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842713 4849 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842722 4849 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842729 4849 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842737 4849 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842746 4849 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842756 4849 feature_gate.go:330] unrecognized feature 
gate: ChunkSizeMiB Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842765 4849 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842776 4849 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842786 4849 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842795 4849 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842803 4849 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842812 4849 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842867 4849 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842886 4849 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842897 4849 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842906 4849 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842914 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842922 4849 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842930 4849 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842938 4849 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 
13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842947 4849 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842955 4849 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842963 4849 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842976 4849 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842985 4849 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.842994 4849 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.843002 4849 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.843012 4849 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.843022 4849 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.843030 4849 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.843038 4849 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843256 4849 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843281 4849 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843334 4849 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843349 4849 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843364 4849 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843393 4849 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843409 4849 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843420 4849 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843430 4849 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843441 4849 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843452 4849 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843461 4849 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843470 
4849 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843481 4849 flags.go:64] FLAG: --cgroup-root="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843492 4849 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843504 4849 flags.go:64] FLAG: --client-ca-file="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843514 4849 flags.go:64] FLAG: --cloud-config="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843525 4849 flags.go:64] FLAG: --cloud-provider="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843537 4849 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843560 4849 flags.go:64] FLAG: --cluster-domain="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843572 4849 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843584 4849 flags.go:64] FLAG: --config-dir="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843595 4849 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843608 4849 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843623 4849 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843635 4849 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843647 4849 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843659 4849 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843671 4849 flags.go:64] FLAG: --contention-profiling="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843683 4849 
flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843694 4849 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843707 4849 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843717 4849 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843729 4849 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843738 4849 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843747 4849 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843756 4849 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843766 4849 flags.go:64] FLAG: --enable-server="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843776 4849 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843795 4849 flags.go:64] FLAG: --event-burst="100" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843805 4849 flags.go:64] FLAG: --event-qps="50" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843880 4849 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843903 4849 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843913 4849 flags.go:64] FLAG: --eviction-hard="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843925 4849 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843935 4849 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843945 4849 
flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843954 4849 flags.go:64] FLAG: --eviction-soft="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843964 4849 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843974 4849 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843983 4849 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.843993 4849 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844002 4849 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844011 4849 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844020 4849 flags.go:64] FLAG: --feature-gates="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844031 4849 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844041 4849 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844050 4849 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844060 4849 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844069 4849 flags.go:64] FLAG: --healthz-port="10248" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844078 4849 flags.go:64] FLAG: --help="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844087 4849 flags.go:64] FLAG: --hostname-override="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844096 4849 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844105 4849 flags.go:64] 
FLAG: --http-check-frequency="20s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844115 4849 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844123 4849 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844132 4849 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844143 4849 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844155 4849 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844177 4849 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844196 4849 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844209 4849 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844222 4849 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844233 4849 flags.go:64] FLAG: --kube-reserved="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844245 4849 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844256 4849 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844268 4849 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844299 4849 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844312 4849 flags.go:64] FLAG: --lock-file="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844324 4849 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844336 4849 
flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844348 4849 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844365 4849 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844377 4849 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844386 4849 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844395 4849 flags.go:64] FLAG: --logging-format="text" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844404 4849 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844414 4849 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844426 4849 flags.go:64] FLAG: --manifest-url="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844437 4849 flags.go:64] FLAG: --manifest-url-header="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844452 4849 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844464 4849 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844477 4849 flags.go:64] FLAG: --max-pods="110" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844489 4849 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844500 4849 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844512 4849 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844522 4849 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 
13:24:10.844533 4849 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844545 4849 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844557 4849 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844582 4849 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844599 4849 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844611 4849 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844624 4849 flags.go:64] FLAG: --pod-cidr="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844635 4849 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844652 4849 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844662 4849 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844672 4849 flags.go:64] FLAG: --pods-per-core="0" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844681 4849 flags.go:64] FLAG: --port="10250" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844692 4849 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844704 4849 flags.go:64] FLAG: --provider-id="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844715 4849 flags.go:64] FLAG: --qos-reserved="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844725 4849 flags.go:64] FLAG: --read-only-port="10255" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 
13:24:10.844751 4849 flags.go:64] FLAG: --register-node="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844761 4849 flags.go:64] FLAG: --register-schedulable="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844770 4849 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844789 4849 flags.go:64] FLAG: --registry-burst="10" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844799 4849 flags.go:64] FLAG: --registry-qps="5" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844808 4849 flags.go:64] FLAG: --reserved-cpus="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844860 4849 flags.go:64] FLAG: --reserved-memory="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844887 4849 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844898 4849 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844907 4849 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844917 4849 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844927 4849 flags.go:64] FLAG: --runonce="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844936 4849 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844946 4849 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844955 4849 flags.go:64] FLAG: --seccomp-default="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844964 4849 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844974 4849 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: 
I0320 13:24:10.844984 4849 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.844993 4849 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845002 4849 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845012 4849 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845027 4849 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845036 4849 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845045 4849 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845055 4849 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845064 4849 flags.go:64] FLAG: --system-cgroups="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845072 4849 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845089 4849 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845098 4849 flags.go:64] FLAG: --tls-cert-file="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845107 4849 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845129 4849 flags.go:64] FLAG: --tls-min-version="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845139 4849 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845151 4849 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845175 4849 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 13:24:10 crc 
kubenswrapper[4849]: I0320 13:24:10.845190 4849 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845202 4849 flags.go:64] FLAG: --v="2" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845252 4849 flags.go:64] FLAG: --version="false" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845265 4849 flags.go:64] FLAG: --vmodule="" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845276 4849 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.845286 4849 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845634 4849 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845649 4849 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845658 4849 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845666 4849 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845674 4849 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845682 4849 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845689 4849 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845698 4849 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845705 4849 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845789 4849 feature_gate.go:330] 
unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845799 4849 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845808 4849 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845857 4849 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845877 4849 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845887 4849 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845964 4849 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845975 4849 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845985 4849 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.845995 4849 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846005 4849 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846015 4849 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846025 4849 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846035 4849 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846046 4849 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:24:10 
crc kubenswrapper[4849]: W0320 13:24:10.846055 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846065 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846074 4849 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846083 4849 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846093 4849 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846103 4849 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846115 4849 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846126 4849 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846164 4849 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846187 4849 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846202 4849 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846213 4849 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846222 4849 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846231 4849 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846240 4849 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846249 4849 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846257 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846265 4849 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846273 4849 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846282 4849 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846290 4849 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846298 4849 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846306 4849 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846314 4849 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846322 4849 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846342 4849 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846350 4849 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846360 4849 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846373 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846385 4849 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846396 4849 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846415 4849 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846434 4849 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846445 4849 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846454 4849 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846478 4849 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846488 4849 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846520 4849 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846532 4849 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846540 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846548 4849 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846557 4849 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846568 4849 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846578 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846598 4849 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846606 4849 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.846615 4849 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.846640 4849 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.859038 4849 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.859079 4849 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859150 4849 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859160 4849 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859164 4849 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859168 4849 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859173 4849 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859179 4849 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859183 4849 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859187 4849 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859191 4849 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859195 4849 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859198 4849 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859202 4849 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859206 4849 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859211 4849 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859215 4849 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859219 4849 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859223 4849 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859226 4849 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859229 4849 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859234 4849 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859238 4849 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859241 4849 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859248 4849 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859252 4849 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859257 4849 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859263 4849 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859268 4849 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859273 4849 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859278 4849 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859282 4849 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859286 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859290 4849 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859293 4849 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859297 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859301 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859305 4849 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859308 4849 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859313 4849 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859319 4849 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859323 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859327 4849 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859331 4849 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859335 4849 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859339 4849 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859343 4849 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859347 4849 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859351 4849 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859355 4849 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859358 4849 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859362 4849 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859367 4849 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859371 4849 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859375 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859378 4849 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859382 4849 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859386 4849 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859389 4849 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859393 4849 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859396 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859400 4849 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859403 4849 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859407 4849 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859410 4849 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859415 4849 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859420 4849 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859424 4849 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859428 4849 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859432 4849 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859435 4849 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859440 4849 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859444 4849 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.859450 4849 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859581 4849 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859590 4849 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859594 4849 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859598 4849 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859602 4849 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859606 4849 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859609 4849 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859613 4849 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859617 4849 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859621 4849 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859624 4849 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859628 4849 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859632 4849 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859636 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859640 4849 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859643 4849 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859647 4849 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859652 4849 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859656 4849 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859660 4849 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859665 4849 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859669 4849 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859673 4849 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859676 4849 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859680 4849 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859684 4849 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859687 4849 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859690 4849 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859694 4849 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859697 4849 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859701 4849 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859705 4849 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859709 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859712 4849 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859716 4849 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859720 4849 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859723 4849 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859728 4849 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859732 4849 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859737 4849 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859743 4849 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859749 4849 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859755 4849 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859761 4849 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859791 4849 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859798 4849 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859802 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859806 4849 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859810 4849 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859814 4849 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859835 4849 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859840 4849 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859851 4849 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859854 4849 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859858 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859861 4849 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859865 4849 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859869 4849 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859874 4849 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859878 4849 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859882 4849 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859886 4849 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859890 4849 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859894 4849 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859897 4849 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859901 4849 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859904 4849 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859908 4849 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859912 4849 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859915 4849 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.859919 4849 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.859924 4849 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.860086 4849 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 13:24:10 crc kubenswrapper[4849]: E0320 13:24:10.863579 4849 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.867480 4849 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.867601 4849 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.869197 4849 server.go:997] "Starting client certificate rotation"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.869230 4849 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.869392 4849 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.895652 4849 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.899447 4849 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:24:10 crc kubenswrapper[4849]: E0320 13:24:10.899490 4849 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.915669 4849 log.go:25] "Validated CRI v1 runtime API"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.944201 4849 log.go:25] "Validated CRI v1 image API"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.946224 4849 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.949857 4849 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-13-18-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.949911 4849 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.965447 4849 manager.go:217] Machine: {Timestamp:2026-03-20 13:24:10.963881022 +0000 UTC m=+0.641604437 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5558133e-3d97-4e22-9873-bad3dbc7167b BootID:c9268129-01d7-4b12-98d7-58087a6062f7 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:fc:59:05 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:fc:59:05 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:83:c3:6d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:cf:95:af Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9a:a5:89 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:aa:99:83 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4a:94:3f:ab:e6:58 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3e:8d:76:db:fd:4f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.966100 4849 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.966294 4849 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.966938 4849 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.967102 4849 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.967143 4849 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.967327 4849 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.967337 4849 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.967764 4849 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.967794 4849 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.967989 4849 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.968062 4849 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.970925 4849 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.970950 4849 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.970966 4849 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.970979 4849 kubelet.go:324] "Adding apiserver pod source"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.970991 4849 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.974340 4849 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.975482 4849 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.977749 4849 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.978615 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.978631 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 20 13:24:10 crc kubenswrapper[4849]: E0320 13:24:10.978752 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:24:10 crc kubenswrapper[4849]: E0320 13:24:10.978769 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:24:10
crc kubenswrapper[4849]: I0320 13:24:10.979396 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979423 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979430 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979437 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979448 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979454 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979461 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979471 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979478 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979486 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979497 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.979503 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.980284 4849 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.980698 4849 server.go:1280] "Started kubelet" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 
13:24:10.980886 4849 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.981080 4849 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.981716 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.982183 4849 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 13:24:10 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.982869 4849 server.go:460] "Adding debug handlers to kubelet server" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.983801 4849 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.984681 4849 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.984753 4849 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 13:24:10 crc kubenswrapper[4849]: E0320 13:24:10.988992 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.989282 4849 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 13:24:10 crc kubenswrapper[4849]: E0320 13:24:10.989649 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms" 
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.989704 4849 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 13:24:10 crc kubenswrapper[4849]: W0320 13:24:10.990169 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 20 13:24:10 crc kubenswrapper[4849]: E0320 13:24:10.990232 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.990501 4849 factory.go:55] Registering systemd factory
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.990588 4849 factory.go:221] Registration of the systemd container factory successfully
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.991232 4849 factory.go:153] Registering CRI-O factory
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.991277 4849 factory.go:221] Registration of the crio container factory successfully
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.991368 4849 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.991403 4849 factory.go:103] Registering Raw factory
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.991422 4849 manager.go:1196] Started watching for new ooms in manager
Mar 20 13:24:10 crc kubenswrapper[4849]: I0320 13:24:10.993720 4849 manager.go:319] Starting recovery of all containers
Mar 20 13:24:10 crc kubenswrapper[4849]: E0320 13:24:10.991201 4849 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8f78829c2e4d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:10.980675149 +0000 UTC m=+0.658398534,LastTimestamp:2026-03-20 13:24:10.980675149 +0000 UTC m=+0.658398534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000206 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000269 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000281 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000293 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000304 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000317 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000328 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000351 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000368 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000381 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000401 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000409 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000417 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000429 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000438 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000447 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000458 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000467 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000510 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000521 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000531 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000540 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000550 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000559 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000567 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000597 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000609 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000619 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000628 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000636 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000645 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000655 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000665 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000674 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000682 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000690 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000700 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000710 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000720 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000728 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000737 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000747 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000756 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000765 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000774 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000785 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000796 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000807 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000839 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000848 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000857 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000866 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000878 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000892 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000901 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000909 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000919 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000927 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000935 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000943 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000953 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000962 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000970 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000978 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000988 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.000996 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001005 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001014 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001023 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001032 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001040 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001048 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001057 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001066 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001075 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001085 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001094 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001102 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001111 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001119 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001129 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001138 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001147 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config"
seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001155 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001164 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001176 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001185 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001194 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001204 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001213 
4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001222 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001230 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001239 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001247 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001255 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001264 4849 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001273 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001280 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001288 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001297 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001305 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001313 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001322 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001330 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001342 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001353 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001362 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001372 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001382 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001392 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001407 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001416 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001425 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001434 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001442 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001451 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001458 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001469 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001478 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001487 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001497 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001505 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001513 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001521 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001529 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001537 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: 
I0320 13:24:11.001546 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001555 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001570 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001603 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001614 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001622 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001631 4849 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001640 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001648 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001657 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001666 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001675 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001686 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001695 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001704 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001712 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001721 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001731 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001742 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001751 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001760 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001768 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001777 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001787 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001796 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001805 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001831 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001840 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001849 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001859 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001867 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001876 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001885 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001893 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001904 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001913 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001922 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.001932 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.003806 4849 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.003879 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.003923 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.003957 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.003975 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.003992 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004007 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004021 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004038 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004054 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004069 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004083 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004097 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004111 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004124 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004138 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004152 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" 
seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004178 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004191 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004203 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004215 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004227 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004241 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004252 4849 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004263 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004273 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004284 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004293 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004305 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004314 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004324 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004332 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004343 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004353 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004363 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004372 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004384 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004392 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004402 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004417 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004428 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004437 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" 
seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004447 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004456 4849 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004467 4849 reconstruct.go:97] "Volume reconstruction finished" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.004498 4849 reconciler.go:26] "Reconciler: start to sync state" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.008975 4849 manager.go:324] Recovery completed Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.017793 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.020563 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.020606 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.020618 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.021374 4849 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.021392 4849 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 13:24:11 crc kubenswrapper[4849]: 
I0320 13:24:11.021452 4849 state_mem.go:36] "Initialized new in-memory state store" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.031219 4849 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.033069 4849 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.034435 4849 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.034500 4849 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.034555 4849 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 13:24:11 crc kubenswrapper[4849]: W0320 13:24:11.036186 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.036272 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.040491 4849 policy_none.go:49] "None policy: Start" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.041297 4849 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.041321 4849 state_mem.go:35] "Initializing new in-memory state store" Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 
13:24:11.089914 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.092988 4849 manager.go:334] "Starting Device Plugin manager" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.093053 4849 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.093070 4849 server.go:79] "Starting device plugin registration server" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.093570 4849 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.093588 4849 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.093961 4849 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.094064 4849 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.094080 4849 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.099809 4849 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.134975 4849 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.135108 4849 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.136595 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.136637 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.136646 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.136779 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.137138 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.137221 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.137543 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.137565 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.137573 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.137659 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.137872 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.137919 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138354 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138395 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138411 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138672 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138698 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138707 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138687 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138727 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138736 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138853 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138902 
4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.138930 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139376 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139409 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139421 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139549 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139658 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139721 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139668 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139807 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.139828 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.140257 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.140281 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.140293 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.140459 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.140495 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.140515 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.140542 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.140554 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.141084 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.141122 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.141135 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.190425 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.194612 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.195663 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 
13:24:11.195694 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.195703 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.195724 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.196151 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207277 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207311 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207337 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207358 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207373 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207389 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207407 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207437 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207490 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207537 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207577 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207604 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207622 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207650 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.207666 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308395 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308443 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308499 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308524 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc 
kubenswrapper[4849]: I0320 13:24:11.308535 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308577 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308620 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308569 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308634 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308590 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308711 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308740 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308763 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308787 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308792 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308809 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308848 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308853 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308880 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308876 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308923 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308972 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.309022 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.308944 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.309046 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.309065 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.309072 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.309095 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.309184 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.309256 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.396871 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.398201 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.398258 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.398271 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.398342 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.398872 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.474167 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.480984 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.500351 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: W0320 13:24:11.521132 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ed356a1d20aa248ecc7ced9522bad81e6f7af79aab9559316463bdb51e3a157b WatchSource:0}: Error finding container ed356a1d20aa248ecc7ced9522bad81e6f7af79aab9559316463bdb51e3a157b: Status 404 returned error can't find the container with id ed356a1d20aa248ecc7ced9522bad81e6f7af79aab9559316463bdb51e3a157b Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.521508 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.526352 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:11 crc kubenswrapper[4849]: W0320 13:24:11.528482 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-fe50ebd92837a89433a8b00758094f72da4fa482c6c4b0c084bbc7db99364b60 WatchSource:0}: Error finding container fe50ebd92837a89433a8b00758094f72da4fa482c6c4b0c084bbc7db99364b60: Status 404 returned error can't find the container with id fe50ebd92837a89433a8b00758094f72da4fa482c6c4b0c084bbc7db99364b60 Mar 20 13:24:11 crc kubenswrapper[4849]: W0320 13:24:11.543565 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-52a2fca3329569c5aac991d09c3500ba151802aa182184520808d55d81f756c4 WatchSource:0}: Error finding container 52a2fca3329569c5aac991d09c3500ba151802aa182184520808d55d81f756c4: Status 404 returned error can't find the container with id 52a2fca3329569c5aac991d09c3500ba151802aa182184520808d55d81f756c4 Mar 20 13:24:11 crc kubenswrapper[4849]: W0320 13:24:11.547860 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4c145a5c2c56a0eea6090d312e02f7e024c17ac0ab685a6bc8f4e52112a82562 WatchSource:0}: Error finding container 4c145a5c2c56a0eea6090d312e02f7e024c17ac0ab685a6bc8f4e52112a82562: Status 404 returned error can't find the container with id 4c145a5c2c56a0eea6090d312e02f7e024c17ac0ab685a6bc8f4e52112a82562 Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.591489 4849 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.799688 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.801278 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.801318 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.801330 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.801352 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.801964 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Mar 20 13:24:11 crc kubenswrapper[4849]: W0320 13:24:11.906476 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.906618 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: 
connect: connection refused" logger="UnhandledError" Mar 20 13:24:11 crc kubenswrapper[4849]: W0320 13:24:11.920089 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:11 crc kubenswrapper[4849]: E0320 13:24:11.920275 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:24:11 crc kubenswrapper[4849]: I0320 13:24:11.983243 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:12 crc kubenswrapper[4849]: W0320 13:24:12.030187 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:12 crc kubenswrapper[4849]: E0320 13:24:12.030301 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.038281 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fe50ebd92837a89433a8b00758094f72da4fa482c6c4b0c084bbc7db99364b60"} Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.039085 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dcb28b586d3c6822743ea0410eed0158c92e16eb3103f7bfab111fd938868864"} Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.039703 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ed356a1d20aa248ecc7ced9522bad81e6f7af79aab9559316463bdb51e3a157b"} Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.040575 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4c145a5c2c56a0eea6090d312e02f7e024c17ac0ab685a6bc8f4e52112a82562"} Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.041392 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"52a2fca3329569c5aac991d09c3500ba151802aa182184520808d55d81f756c4"} Mar 20 13:24:12 crc kubenswrapper[4849]: E0320 13:24:12.393296 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Mar 20 13:24:12 crc kubenswrapper[4849]: W0320 13:24:12.572926 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:12 crc kubenswrapper[4849]: E0320 13:24:12.573065 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.602604 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.604507 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.604563 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.604582 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.604628 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:12 crc kubenswrapper[4849]: E0320 13:24:12.605414 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Mar 20 13:24:12 crc kubenswrapper[4849]: I0320 13:24:12.983288 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 
13:24:13.047978 4849 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e" exitCode=0 Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.048083 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e"} Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.048207 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.049843 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.049897 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.049911 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.053292 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a"} Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.053335 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3"} Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.053357 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5"} Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.053370 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a"} Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.053472 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.054293 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.054320 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.054331 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.059903 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1" exitCode=0 Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.060014 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1"} Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.060065 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:13 crc 
kubenswrapper[4849]: I0320 13:24:13.061227 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.061270 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.061284 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.063501 4849 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0" exitCode=0 Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.063615 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0"} Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.063897 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.064506 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.065346 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.065413 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.065426 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.065437 4849 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.065473 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.065484 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.066925 4849 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="003efb7e4024827de6fbf19b52af32b65ed3498dee9bae9127b5df2d13ea3711" exitCode=0 Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.066968 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"003efb7e4024827de6fbf19b52af32b65ed3498dee9bae9127b5df2d13ea3711"} Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.067058 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.068674 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.068701 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.068712 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.074427 4849 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:24:13 crc kubenswrapper[4849]: E0320 13:24:13.075396 4849 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed 
while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.218532 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.472560 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:13 crc kubenswrapper[4849]: W0320 13:24:13.654921 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:13 crc kubenswrapper[4849]: E0320 13:24:13.655440 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:24:13 crc kubenswrapper[4849]: I0320 13:24:13.983313 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:13 crc kubenswrapper[4849]: E0320 13:24:13.994304 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.194:6443: connect: connection refused" interval="3.2s" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.071766 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.071805 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.071832 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.071807 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.072741 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.072779 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.072789 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.076319 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.076364 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.076379 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.076388 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.082909 4849 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8" exitCode=0 Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.083056 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.083076 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.083744 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:14 crc 
kubenswrapper[4849]: I0320 13:24:14.083773 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.083784 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.087876 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.088325 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.088568 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6651f66bbfda244662fbafe6b03ba13712cb012dc7ffff1aa006805a3e29c443"} Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.088601 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.088949 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.088976 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.088987 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.089014 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.089030 4849 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.089039 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.206448 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.207721 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.207752 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.207796 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:14 crc kubenswrapper[4849]: I0320 13:24:14.207873 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:14 crc kubenswrapper[4849]: E0320 13:24:14.208251 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Mar 20 13:24:14 crc kubenswrapper[4849]: W0320 13:24:14.368583 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 20 13:24:14 crc kubenswrapper[4849]: E0320 13:24:14.368651 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" 
logger="UnhandledError" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.093052 4849 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e" exitCode=0 Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.093164 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.093163 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e"} Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.094419 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.094514 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.094538 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.096996 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9dd39a973bfc6cc7d30cdaa39f2aa3e43c74512afb896b91404533b61e51475a"} Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.097051 4849 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.097079 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.097105 4849 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.097115 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.097202 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.098190 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.098215 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.098227 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099081 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099133 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099149 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099173 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099194 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099205 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099089 4849 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099241 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:15 crc kubenswrapper[4849]: I0320 13:24:15.099252 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.105041 4849 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.105067 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.105102 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.105022 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356"} Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.105218 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d"} Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.105241 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93"} Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.105262 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82"} Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.105282 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2"} Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.108072 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.108387 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.108862 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.108907 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.108923 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.110034 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.110080 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.110106 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.111634 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.111667 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.111680 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:16 crc kubenswrapper[4849]: I0320 13:24:16.817679 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.108469 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.109930 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.109996 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.110019 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.184815 4849 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.289782 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.290052 4849 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.290116 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.291900 4849 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.291956 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.291974 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.351487 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.409247 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.410846 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.410893 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.410909 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:17 crc kubenswrapper[4849]: I0320 13:24:17.410944 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.110600 4849 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.110650 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.110606 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.111546 4849 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.111567 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.111575 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.112213 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.112237 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.112244 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.182146 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.182379 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.183763 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.183867 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:18 crc kubenswrapper[4849]: I0320 13:24:18.183893 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:20 crc kubenswrapper[4849]: I0320 13:24:20.047150 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 13:24:20 crc 
kubenswrapper[4849]: I0320 13:24:20.047345 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:20 crc kubenswrapper[4849]: I0320 13:24:20.048674 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:20 crc kubenswrapper[4849]: I0320 13:24:20.048748 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:20 crc kubenswrapper[4849]: I0320 13:24:20.048775 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:21 crc kubenswrapper[4849]: E0320 13:24:21.100045 4849 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:24:21 crc kubenswrapper[4849]: I0320 13:24:21.182514 4849 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:24:21 crc kubenswrapper[4849]: I0320 13:24:21.182616 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:24:21 crc kubenswrapper[4849]: I0320 13:24:21.356716 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:21 crc kubenswrapper[4849]: I0320 13:24:21.357313 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 20 13:24:21 crc kubenswrapper[4849]: I0320 13:24:21.359705 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:21 crc kubenswrapper[4849]: I0320 13:24:21.359758 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:21 crc kubenswrapper[4849]: I0320 13:24:21.359777 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:22 crc kubenswrapper[4849]: I0320 13:24:22.246544 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:22 crc kubenswrapper[4849]: I0320 13:24:22.246648 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:22 crc kubenswrapper[4849]: I0320 13:24:22.248238 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:22 crc kubenswrapper[4849]: I0320 13:24:22.248303 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:22 crc kubenswrapper[4849]: I0320 13:24:22.248319 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:22 crc kubenswrapper[4849]: I0320 13:24:22.868066 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:23 crc kubenswrapper[4849]: I0320 13:24:23.126783 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:23 crc kubenswrapper[4849]: I0320 13:24:23.128016 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:23 crc kubenswrapper[4849]: 
I0320 13:24:23.128047 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:23 crc kubenswrapper[4849]: I0320 13:24:23.128056 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:24 crc kubenswrapper[4849]: W0320 13:24:24.406195 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:24:24 crc kubenswrapper[4849]: I0320 13:24:24.406357 4849 trace.go:236] Trace[1679370464]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:24:14.405) (total time: 10001ms): Mar 20 13:24:24 crc kubenswrapper[4849]: Trace[1679370464]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (13:24:24.406) Mar 20 13:24:24 crc kubenswrapper[4849]: Trace[1679370464]: [10.001109252s] [10.001109252s] END Mar 20 13:24:24 crc kubenswrapper[4849]: E0320 13:24:24.406393 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 13:24:24 crc kubenswrapper[4849]: W0320 13:24:24.654233 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 
2026-02-23T05:33:13Z Mar 20 13:24:24 crc kubenswrapper[4849]: E0320 13:24:24.654315 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:24:24 crc kubenswrapper[4849]: E0320 13:24:24.654512 4849 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8f78829c2e4d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:10.980675149 +0000 UTC m=+0.658398534,LastTimestamp:2026-03-20 13:24:10.980675149 +0000 UTC m=+0.658398534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:24 crc kubenswrapper[4849]: W0320 13:24:24.666573 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z Mar 20 13:24:24 crc kubenswrapper[4849]: E0320 13:24:24.666658 4849 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:24:24 crc kubenswrapper[4849]: E0320 13:24:24.670546 4849 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:24:24 crc kubenswrapper[4849]: I0320 13:24:24.671759 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z Mar 20 13:24:24 crc kubenswrapper[4849]: E0320 13:24:24.672866 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 13:24:24 crc kubenswrapper[4849]: E0320 13:24:24.673411 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:24:24 crc kubenswrapper[4849]: W0320 13:24:24.675197 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z Mar 20 13:24:24 crc kubenswrapper[4849]: E0320 13:24:24.675263 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:24:24 crc kubenswrapper[4849]: I0320 13:24:24.677769 4849 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 20 13:24:24 crc kubenswrapper[4849]: I0320 13:24:24.677853 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 
13:24:24 crc kubenswrapper[4849]: I0320 13:24:24.698132 4849 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]log ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]etcd ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/priority-and-fairness-filter ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/start-apiextensions-informers ok Mar 20 13:24:24 crc kubenswrapper[4849]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 20 13:24:24 crc kubenswrapper[4849]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/start-system-namespaces-controller ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 20 13:24:24 crc 
kubenswrapper[4849]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 20 13:24:24 crc kubenswrapper[4849]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 20 13:24:24 crc kubenswrapper[4849]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 20 13:24:24 crc kubenswrapper[4849]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 20 13:24:24 crc kubenswrapper[4849]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/start-kube-aggregator-informers ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 20 13:24:24 crc kubenswrapper[4849]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 20 13:24:24 crc kubenswrapper[4849]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]autoregister-completion ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/apiservice-openapi-controller ok Mar 20 13:24:24 crc kubenswrapper[4849]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 20 13:24:24 crc kubenswrapper[4849]: livez check failed Mar 20 13:24:24 crc kubenswrapper[4849]: I0320 13:24:24.698183 4849 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:24 crc kubenswrapper[4849]: I0320 13:24:24.985069 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:24Z is after 2026-02-23T05:33:13Z Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.133095 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.135282 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9dd39a973bfc6cc7d30cdaa39f2aa3e43c74512afb896b91404533b61e51475a" exitCode=255 Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.135328 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9dd39a973bfc6cc7d30cdaa39f2aa3e43c74512afb896b91404533b61e51475a"} Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.135457 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.136333 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.136363 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.136372 4849 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.136794 4849 scope.go:117] "RemoveContainer" containerID="9dd39a973bfc6cc7d30cdaa39f2aa3e43c74512afb896b91404533b61e51475a" Mar 20 13:24:25 crc kubenswrapper[4849]: I0320 13:24:25.984593 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:25Z is after 2026-02-23T05:33:13Z Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.138811 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.139339 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.140759 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb" exitCode=255 Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.140797 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb"} Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.140846 4849 scope.go:117] "RemoveContainer" containerID="9dd39a973bfc6cc7d30cdaa39f2aa3e43c74512afb896b91404533b61e51475a" Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.140925 4849 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.141582 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.141605 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.141614 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.142040 4849 scope.go:117] "RemoveContainer" containerID="8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb" Mar 20 13:24:26 crc kubenswrapper[4849]: E0320 13:24:26.142197 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:26 crc kubenswrapper[4849]: I0320 13:24:26.984899 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:26Z is after 2026-02-23T05:33:13Z Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.145419 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.296441 4849 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.296598 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.298135 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.298340 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.298538 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.299787 4849 scope.go:117] "RemoveContainer" containerID="8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb" Mar 20 13:24:27 crc kubenswrapper[4849]: E0320 13:24:27.300305 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.300881 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:27 crc kubenswrapper[4849]: I0320 13:24:27.985503 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:27Z is after 
2026-02-23T05:33:13Z Mar 20 13:24:28 crc kubenswrapper[4849]: I0320 13:24:28.110147 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:28 crc kubenswrapper[4849]: I0320 13:24:28.149761 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:28 crc kubenswrapper[4849]: I0320 13:24:28.150472 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:28 crc kubenswrapper[4849]: I0320 13:24:28.150509 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:28 crc kubenswrapper[4849]: I0320 13:24:28.150521 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:28 crc kubenswrapper[4849]: I0320 13:24:28.151078 4849 scope.go:117] "RemoveContainer" containerID="8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb" Mar 20 13:24:28 crc kubenswrapper[4849]: E0320 13:24:28.151251 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:28 crc kubenswrapper[4849]: W0320 13:24:28.778652 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:28Z is after 2026-02-23T05:33:13Z Mar 20 13:24:28 crc kubenswrapper[4849]: 
E0320 13:24:28.778722 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:24:28 crc kubenswrapper[4849]: I0320 13:24:28.986331 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:28Z is after 2026-02-23T05:33:13Z Mar 20 13:24:29 crc kubenswrapper[4849]: I0320 13:24:29.151725 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:29 crc kubenswrapper[4849]: I0320 13:24:29.152460 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4849]: I0320 13:24:29.152497 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4849]: I0320 13:24:29.152509 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4849]: I0320 13:24:29.153013 4849 scope.go:117] "RemoveContainer" containerID="8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb" Mar 20 13:24:29 crc kubenswrapper[4849]: E0320 13:24:29.153166 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:29 crc kubenswrapper[4849]: W0320 13:24:29.397535 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:29Z is after 2026-02-23T05:33:13Z Mar 20 13:24:29 crc kubenswrapper[4849]: E0320 13:24:29.397614 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:24:29 crc kubenswrapper[4849]: I0320 13:24:29.986866 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:29Z is after 2026-02-23T05:33:13Z Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.074336 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.074485 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.075455 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.075495 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.075505 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.089070 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.153760 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.154404 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.154435 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.154446 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4849]: I0320 13:24:30.984591 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2026-02-23T05:33:13Z Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.073556 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.074773 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4849]: 
I0320 13:24:31.074802 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.074812 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.074854 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:31 crc kubenswrapper[4849]: E0320 13:24:31.077205 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:31Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:24:31 crc kubenswrapper[4849]: E0320 13:24:31.078030 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:31Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:24:31 crc kubenswrapper[4849]: E0320 13:24:31.100147 4849 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.183423 4849 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.183537 4849 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.356785 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.357084 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.358186 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.358224 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.358261 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.358883 4849 scope.go:117] "RemoveContainer" containerID="8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb" Mar 20 13:24:31 crc kubenswrapper[4849]: E0320 13:24:31.359077 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:31 crc kubenswrapper[4849]: I0320 13:24:31.986242 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:32 crc kubenswrapper[4849]: W0320 13:24:32.516920 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:32 crc kubenswrapper[4849]: E0320 13:24:32.516984 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:24:32 crc kubenswrapper[4849]: W0320 13:24:32.517206 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 13:24:32 crc kubenswrapper[4849]: E0320 13:24:32.517280 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:24:32 crc kubenswrapper[4849]: I0320 13:24:32.988814 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:33 crc kubenswrapper[4849]: I0320 13:24:33.105742 4849 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:24:33 crc kubenswrapper[4849]: I0320 13:24:33.120955 4849 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:24:33 crc kubenswrapper[4849]: I0320 13:24:33.986443 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.661839 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f78829c2e4d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:10.980675149 +0000 UTC m=+0.658398534,LastTimestamp:2026-03-20 13:24:10.980675149 +0000 UTC m=+0.658398534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.669059 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd4b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,LastTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.673620 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd98fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,LastTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.677280 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fdc228 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020624424 +0000 UTC m=+0.698347819,LastTimestamp:2026-03-20 13:24:11.020624424 +0000 UTC 
m=+0.698347819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.683617 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f788986a279 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.096703609 +0000 UTC m=+0.774427004,LastTimestamp:2026-03-20 13:24:11.096703609 +0000 UTC m=+0.774427004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.689194 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd4b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd4b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,LastTimestamp:2026-03-20 13:24:11.136622795 +0000 UTC m=+0.814346190,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc 
kubenswrapper[4849]: E0320 13:24:34.695558 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd98fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd98fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,LastTimestamp:2026-03-20 13:24:11.136642894 +0000 UTC m=+0.814366289,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.703302 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fdc228\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fdc228 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020624424 +0000 UTC m=+0.698347819,LastTimestamp:2026-03-20 13:24:11.136651784 +0000 UTC m=+0.814375169,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.710005 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd4b17\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd4b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,LastTimestamp:2026-03-20 13:24:11.137559166 +0000 UTC m=+0.815282561,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.713592 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd98fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd98fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,LastTimestamp:2026-03-20 13:24:11.137570666 +0000 UTC m=+0.815294061,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.717421 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fdc228\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fdc228 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020624424 +0000 UTC m=+0.698347819,LastTimestamp:2026-03-20 13:24:11.137578746 +0000 UTC m=+0.815302141,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.721294 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd4b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd4b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,LastTimestamp:2026-03-20 13:24:11.138386132 +0000 UTC m=+0.816109527,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.725313 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd98fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd98fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,LastTimestamp:2026-03-20 13:24:11.138405611 +0000 UTC m=+0.816129016,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.728741 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fdc228\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fdc228 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020624424 +0000 UTC m=+0.698347819,LastTimestamp:2026-03-20 13:24:11.138418021 +0000 UTC m=+0.816141426,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.734232 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd4b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd4b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,LastTimestamp:2026-03-20 13:24:11.138691082 +0000 UTC m=+0.816414467,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.739544 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd98fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd98fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,LastTimestamp:2026-03-20 13:24:11.138703882 +0000 UTC m=+0.816427267,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.743388 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fdc228\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fdc228 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020624424 +0000 UTC 
m=+0.698347819,LastTimestamp:2026-03-20 13:24:11.138711982 +0000 UTC m=+0.816435377,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.748513 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd4b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd4b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,LastTimestamp:2026-03-20 13:24:11.138720752 +0000 UTC m=+0.816444147,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.753808 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd98fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd98fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,LastTimestamp:2026-03-20 13:24:11.138733111 +0000 UTC m=+0.816456506,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.758531 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fdc228\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fdc228 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020624424 +0000 UTC m=+0.698347819,LastTimestamp:2026-03-20 13:24:11.138741331 +0000 UTC m=+0.816464726,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.763777 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd4b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd4b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,LastTimestamp:2026-03-20 13:24:11.139396851 +0000 UTC m=+0.817120246,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.768524 4849 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd98fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd98fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,LastTimestamp:2026-03-20 13:24:11.139416941 +0000 UTC m=+0.817140336,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.774089 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fdc228\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fdc228 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020624424 +0000 UTC m=+0.698347819,LastTimestamp:2026-03-20 13:24:11.1394271 +0000 UTC m=+0.817150495,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.778457 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd4b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd4b17 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020593943 +0000 UTC m=+0.698317348,LastTimestamp:2026-03-20 13:24:11.139797539 +0000 UTC m=+0.817520934,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.782091 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f7884fd98fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f7884fd98fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.020613884 +0000 UTC m=+0.698337279,LastTimestamp:2026-03-20 13:24:11.139813289 +0000 UTC m=+0.817536674,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.788143 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f78a376c285 openshift-machine-config-operator 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.531870853 +0000 UTC m=+1.209594248,LastTimestamp:2026-03-20 13:24:11.531870853 +0000 UTC m=+1.209594248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.791690 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f78a3834abc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.532692156 +0000 UTC m=+1.210415551,LastTimestamp:2026-03-20 13:24:11.532692156 +0000 UTC m=+1.210415551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.797525 4849 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f78a38449dd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.532757469 +0000 UTC m=+1.210480864,LastTimestamp:2026-03-20 13:24:11.532757469 +0000 UTC m=+1.210480864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.801991 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78a46efbff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.548138495 +0000 UTC 
m=+1.225861890,LastTimestamp:2026-03-20 13:24:11.548138495 +0000 UTC m=+1.225861890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.805526 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f78a49bc809 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:11.551074313 +0000 UTC m=+1.228797708,LastTimestamp:2026-03-20 13:24:11.551074313 +0000 UTC m=+1.228797708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.809883 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78c5943383 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.104225667 +0000 UTC m=+1.781949062,LastTimestamp:2026-03-20 13:24:12.104225667 +0000 UTC m=+1.781949062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.814948 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f78c5b1e4da openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.10617161 +0000 UTC m=+1.783895005,LastTimestamp:2026-03-20 13:24:12.10617161 +0000 UTC m=+1.783895005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.819703 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f78c5bf3c50 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.107045968 +0000 UTC m=+1.784769373,LastTimestamp:2026-03-20 13:24:12.107045968 +0000 UTC m=+1.784769373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.823866 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f78c5c08c04 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.107131908 +0000 UTC m=+1.784855303,LastTimestamp:2026-03-20 13:24:12.107131908 +0000 UTC m=+1.784855303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.828504 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f78c5ca0ea4 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.107755172 +0000 UTC m=+1.785478567,LastTimestamp:2026-03-20 13:24:12.107755172 +0000 UTC m=+1.785478567,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.832751 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78c6361e9e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.11483715 +0000 UTC m=+1.792560545,LastTimestamp:2026-03-20 13:24:12.11483715 +0000 UTC m=+1.792560545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.836562 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78c64ba020 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.11624656 +0000 UTC m=+1.793969955,LastTimestamp:2026-03-20 13:24:12.11624656 +0000 UTC m=+1.793969955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.840558 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f78c658707b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.117086331 +0000 UTC m=+1.794809736,LastTimestamp:2026-03-20 13:24:12.117086331 +0000 UTC m=+1.794809736,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.843805 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f78c668146e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.118111342 +0000 UTC m=+1.795834737,LastTimestamp:2026-03-20 13:24:12.118111342 +0000 UTC m=+1.795834737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.847588 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f78c69ab9cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.121430479 +0000 UTC m=+1.799153874,LastTimestamp:2026-03-20 13:24:12.121430479 +0000 UTC 
m=+1.799153874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.850891 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f78c6c7e1df openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.124389855 +0000 UTC m=+1.802113250,LastTimestamp:2026-03-20 13:24:12.124389855 +0000 UTC m=+1.802113250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.855056 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78d82f5450 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.416382032 +0000 UTC 
m=+2.094105427,LastTimestamp:2026-03-20 13:24:12.416382032 +0000 UTC m=+2.094105427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.859065 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78d8eae85a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.428675162 +0000 UTC m=+2.106398587,LastTimestamp:2026-03-20 13:24:12.428675162 +0000 UTC m=+2.106398587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.863778 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78d9088344 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.430615364 +0000 UTC m=+2.108338759,LastTimestamp:2026-03-20 13:24:12.430615364 +0000 UTC m=+2.108338759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.868324 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78e6e0a07a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.662882426 +0000 UTC m=+2.340605901,LastTimestamp:2026-03-20 13:24:12.662882426 +0000 UTC m=+2.340605901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.872559 4849 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78e7afebcd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.676467661 +0000 UTC m=+2.354191086,LastTimestamp:2026-03-20 13:24:12.676467661 +0000 UTC m=+2.354191086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.875943 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78e7c6a9e9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:24:12.677958121 +0000 UTC m=+2.355681556,LastTimestamp:2026-03-20 13:24:12.677958121 +0000 UTC m=+2.355681556,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.879518 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78f3fa85c5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.882683333 +0000 UTC m=+2.560406728,LastTimestamp:2026-03-20 13:24:12.882683333 +0000 UTC m=+2.560406728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.884089 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78f4a590d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.893892824 +0000 UTC m=+2.571616219,LastTimestamp:2026-03-20 13:24:12.893892824 +0000 UTC m=+2.571616219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.888003 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f78fe138d01 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.052095745 +0000 UTC m=+2.729819140,LastTimestamp:2026-03-20 13:24:13.052095745 +0000 UTC m=+2.729819140,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.892451 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f78fecd722d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.064278573 +0000 UTC m=+2.742001968,LastTimestamp:2026-03-20 13:24:13.064278573 +0000 UTC m=+2.742001968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.896869 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f78fef8ede1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.067128289 +0000 UTC m=+2.744851684,LastTimestamp:2026-03-20 13:24:13.067128289 +0000 UTC m=+2.744851684,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.901053 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f78ff21a1b1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.069795761 +0000 UTC m=+2.747519156,LastTimestamp:2026-03-20 13:24:13.069795761 +0000 UTC m=+2.747519156,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.904874 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f790b352f90 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container 
kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.272403856 +0000 UTC m=+2.950127251,LastTimestamp:2026-03-20 13:24:13.272403856 +0000 UTC m=+2.950127251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.909483 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f790b59cd18 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.27480348 +0000 UTC m=+2.952526875,LastTimestamp:2026-03-20 13:24:13.27480348 +0000 UTC m=+2.952526875,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.914995 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f790b61b374 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.275321204 +0000 UTC m=+2.953044599,LastTimestamp:2026-03-20 13:24:13.275321204 +0000 UTC m=+2.953044599,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.919288 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f790bc46888 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.281790088 +0000 UTC m=+2.959513483,LastTimestamp:2026-03-20 13:24:13.281790088 +0000 UTC m=+2.959513483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.923194 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f790be2d3b2 openshift-kube-scheduler 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.283783602 +0000 UTC m=+2.961506987,LastTimestamp:2026-03-20 13:24:13.283783602 +0000 UTC m=+2.961506987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.927588 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f790bfe782c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.28559518 +0000 UTC m=+2.963318575,LastTimestamp:2026-03-20 13:24:13.28559518 +0000 UTC m=+2.963318575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.932086 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f790c100309 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.286744841 +0000 UTC m=+2.964468236,LastTimestamp:2026-03-20 13:24:13.286744841 +0000 UTC m=+2.964468236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.936292 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f790c2e0d3e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.288713534 +0000 UTC m=+2.966436929,LastTimestamp:2026-03-20 13:24:13.288713534 +0000 UTC m=+2.966436929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.941208 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f790c55b7da openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.291313114 +0000 UTC m=+2.969036499,LastTimestamp:2026-03-20 13:24:13.291313114 +0000 UTC m=+2.969036499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.946650 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f79176e22a0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.477462688 +0000 UTC 
m=+3.155186083,LastTimestamp:2026-03-20 13:24:13.477462688 +0000 UTC m=+3.155186083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.950957 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f7917f7d808 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.48648756 +0000 UTC m=+3.164210955,LastTimestamp:2026-03-20 13:24:13.48648756 +0000 UTC m=+3.164210955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.955291 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f7918188f6c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.48863166 +0000 UTC m=+3.166355055,LastTimestamp:2026-03-20 13:24:13.48863166 +0000 UTC m=+3.166355055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.959301 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f79182b5abf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.489863359 +0000 UTC m=+3.167586754,LastTimestamp:2026-03-20 13:24:13.489863359 +0000 UTC m=+3.167586754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.962931 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f7918ee4ba1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.502639009 +0000 UTC m=+3.180362404,LastTimestamp:2026-03-20 13:24:13.502639009 +0000 UTC m=+3.180362404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.966615 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f7919023503 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.503943939 +0000 UTC m=+3.181667374,LastTimestamp:2026-03-20 13:24:13.503943939 +0000 UTC m=+3.181667374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.970661 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f7922b19c52 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.66643413 +0000 UTC m=+3.344157515,LastTimestamp:2026-03-20 13:24:13.66643413 +0000 UTC m=+3.344157515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.974750 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f7922ca1816 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.668038678 +0000 UTC m=+3.345762073,LastTimestamp:2026-03-20 13:24:13.668038678 +0000 UTC m=+3.345762073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc 
kubenswrapper[4849]: E0320 13:24:34.977926 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f792391394c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.681088844 +0000 UTC m=+3.358812229,LastTimestamp:2026-03-20 13:24:13.681088844 +0000 UTC m=+3.358812229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.981408 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f7923ad9b54 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.682948948 +0000 UTC m=+3.360672343,LastTimestamp:2026-03-20 13:24:13.682948948 +0000 UTC 
m=+3.360672343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: I0320 13:24:34.984948 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.984885 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f7923c1bd32 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.684268338 +0000 UTC m=+3.361991733,LastTimestamp:2026-03-20 13:24:13.684268338 +0000 UTC m=+3.361991733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.986572 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f792b08b4aa openshift-etcd 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.806359722 +0000 UTC m=+3.484083117,LastTimestamp:2026-03-20 13:24:13.806359722 +0000 UTC m=+3.484083117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.988762 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f792e25ba07 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.858593287 +0000 UTC m=+3.536316672,LastTimestamp:2026-03-20 13:24:13.858593287 +0000 UTC m=+3.536316672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.990688 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f792e9ea283 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.866517123 +0000 UTC m=+3.544240518,LastTimestamp:2026-03-20 13:24:13.866517123 +0000 UTC m=+3.544240518,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.992353 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f792eb0caa1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.867707041 +0000 UTC m=+3.545430436,LastTimestamp:2026-03-20 13:24:13.867707041 +0000 UTC m=+3.545430436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 
13:24:34.994394 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f793abf04a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:14.069965984 +0000 UTC m=+3.747689369,LastTimestamp:2026-03-20 13:24:14.069965984 +0000 UTC m=+3.747689369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:34 crc kubenswrapper[4849]: E0320 13:24:34.996008 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f793b80e884 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:14.082672772 +0000 UTC m=+3.760396157,LastTimestamp:2026-03-20 13:24:14.082672772 +0000 UTC m=+3.760396157,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.000036 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f793c100abe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:14.092053182 +0000 UTC m=+3.769776577,LastTimestamp:2026-03-20 13:24:14.092053182 +0000 UTC m=+3.769776577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.004330 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79474282a0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:14.279910048 +0000 UTC m=+3.957633443,LastTimestamp:2026-03-20 
13:24:14.279910048 +0000 UTC m=+3.957633443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.008554 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79484ac154 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:14.297227604 +0000 UTC m=+3.974950999,LastTimestamp:2026-03-20 13:24:14.297227604 +0000 UTC m=+3.974950999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.013106 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f797801e035 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:24:15.097757749 +0000 UTC m=+4.775481144,LastTimestamp:2026-03-20 13:24:15.097757749 +0000 UTC m=+4.775481144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.017365 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79837c22f5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.290319605 +0000 UTC m=+4.968043000,LastTimestamp:2026-03-20 13:24:15.290319605 +0000 UTC m=+4.968043000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.022297 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f7983e7edab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.297383851 +0000 UTC m=+4.975107256,LastTimestamp:2026-03-20 13:24:15.297383851 +0000 UTC 
m=+4.975107256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.026233 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f7983f75988 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.298394504 +0000 UTC m=+4.976117899,LastTimestamp:2026-03-20 13:24:15.298394504 +0000 UTC m=+4.976117899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.030363 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f798de9b8a3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.465273507 +0000 UTC 
m=+5.142996902,LastTimestamp:2026-03-20 13:24:15.465273507 +0000 UTC m=+5.142996902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.034477 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f798e9972f7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.476790007 +0000 UTC m=+5.154513402,LastTimestamp:2026-03-20 13:24:15.476790007 +0000 UTC m=+5.154513402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.038614 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f798ea6b1fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.477658108 +0000 UTC m=+5.155381503,LastTimestamp:2026-03-20 13:24:15.477658108 +0000 UTC m=+5.155381503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.043081 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79985068da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.63977545 +0000 UTC m=+5.317498845,LastTimestamp:2026-03-20 13:24:15.63977545 +0000 UTC m=+5.317498845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.047438 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f7998f22074 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:24:15.650373748 +0000 UTC m=+5.328097153,LastTimestamp:2026-03-20 13:24:15.650373748 +0000 UTC m=+5.328097153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.051032 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f799900529a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.65130409 +0000 UTC m=+5.329027485,LastTimestamp:2026-03-20 13:24:15.65130409 +0000 UTC m=+5.329027485,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.054894 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79a4723bcd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container 
etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.843318733 +0000 UTC m=+5.521042148,LastTimestamp:2026-03-20 13:24:15.843318733 +0000 UTC m=+5.521042148,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.059381 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79a516ef0e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.854112526 +0000 UTC m=+5.531835941,LastTimestamp:2026-03-20 13:24:15.854112526 +0000 UTC m=+5.531835941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.064396 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79a52907aa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:15.855298474 +0000 UTC m=+5.533021869,LastTimestamp:2026-03-20 13:24:15.855298474 +0000 UTC m=+5.533021869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.068312 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79b0855932 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:16.045898034 +0000 UTC m=+5.723621469,LastTimestamp:2026-03-20 13:24:16.045898034 +0000 UTC m=+5.723621469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.072973 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f79b136e65f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:16.057534047 +0000 UTC m=+5.735257452,LastTimestamp:2026-03-20 13:24:16.057534047 +0000 UTC m=+5.735257452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.079334 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:24:35 crc kubenswrapper[4849]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f7ae2b10823 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 13:24:35 crc kubenswrapper[4849]: body: Mar 20 13:24:35 crc kubenswrapper[4849]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:21.182588963 +0000 UTC m=+10.860312358,LastTimestamp:2026-03-20 13:24:21.182588963 +0000 UTC m=+10.860312358,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:24:35 crc kubenswrapper[4849]: > Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.083303 4849 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f7ae2b1f9f1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:21.182650865 +0000 UTC m=+10.860374260,LastTimestamp:2026-03-20 13:24:21.182650865 +0000 UTC m=+10.860374260,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.088171 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:24:35 crc kubenswrapper[4849]: &Event{ObjectMeta:{kube-apiserver-crc.189e8f7bb3063d4f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 13:24:35 crc kubenswrapper[4849]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 20 13:24:35 crc kubenswrapper[4849]: Mar 20 13:24:35 crc kubenswrapper[4849]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:24.677834063 +0000 UTC m=+14.355557458,LastTimestamp:2026-03-20 13:24:24.677834063 +0000 UTC m=+14.355557458,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:24:35 crc kubenswrapper[4849]: > Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.093134 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f7bb306ea5c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:24.677878364 +0000 UTC m=+14.355601759,LastTimestamp:2026-03-20 13:24:24.677878364 +0000 UTC m=+14.355601759,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.097055 4849 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:24:35 crc kubenswrapper[4849]: &Event{ObjectMeta:{kube-apiserver-crc.189e8f7bb43c880b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 20 13:24:35 crc kubenswrapper[4849]: body: [+]ping ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]log ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]etcd ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/priority-and-fairness-filter ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-apiextensions-informers ok Mar 20 13:24:35 crc kubenswrapper[4849]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 20 13:24:35 crc 
kubenswrapper[4849]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-system-namespaces-controller ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 20 13:24:35 crc kubenswrapper[4849]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 20 13:24:35 crc kubenswrapper[4849]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 20 13:24:35 crc kubenswrapper[4849]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 20 13:24:35 crc kubenswrapper[4849]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/start-kube-aggregator-informers ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 20 13:24:35 crc kubenswrapper[4849]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 20 13:24:35 crc kubenswrapper[4849]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 20 13:24:35 crc 
kubenswrapper[4849]: [+]autoregister-completion ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/apiservice-openapi-controller ok Mar 20 13:24:35 crc kubenswrapper[4849]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 20 13:24:35 crc kubenswrapper[4849]: livez check failed Mar 20 13:24:35 crc kubenswrapper[4849]: Mar 20 13:24:35 crc kubenswrapper[4849]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:24.698169355 +0000 UTC m=+14.375892750,LastTimestamp:2026-03-20 13:24:24.698169355 +0000 UTC m=+14.375892750,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:24:35 crc kubenswrapper[4849]: > Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.101977 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f7bb43d08ea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:24.698202346 +0000 UTC m=+14.375925741,LastTimestamp:2026-03-20 13:24:24.698202346 +0000 UTC m=+14.375925741,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.107023 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f792eb0caa1\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f792eb0caa1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:13.867707041 +0000 UTC m=+3.545430436,LastTimestamp:2026-03-20 13:24:25.138284022 +0000 UTC m=+14.816007417,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.110762 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f793abf04a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f793abf04a0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:14.069965984 +0000 UTC m=+3.747689369,LastTimestamp:2026-03-20 13:24:25.298550926 +0000 UTC 
m=+14.976274321,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.114635 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f793b80e884\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f793b80e884 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:14.082672772 +0000 UTC m=+3.760396157,LastTimestamp:2026-03-20 13:24:25.319467624 +0000 UTC m=+14.997191019,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.126446 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:24:35 crc kubenswrapper[4849]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f7d36cace5b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:24:35 crc kubenswrapper[4849]: body: Mar 20 13:24:35 crc kubenswrapper[4849]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:31.183498843 +0000 UTC m=+20.861222238,LastTimestamp:2026-03-20 13:24:31.183498843 +0000 UTC m=+20.861222238,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:24:35 crc kubenswrapper[4849]: > Mar 20 13:24:35 crc kubenswrapper[4849]: E0320 13:24:35.130379 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f7d36cbd162 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:31.183565154 +0000 UTC m=+20.861288549,LastTimestamp:2026-03-20 13:24:31.183565154 +0000 UTC 
m=+20.861288549,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:35 crc kubenswrapper[4849]: I0320 13:24:35.987053 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:36 crc kubenswrapper[4849]: I0320 13:24:36.986955 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:37 crc kubenswrapper[4849]: I0320 13:24:37.985842 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:38 crc kubenswrapper[4849]: I0320 13:24:38.078126 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:38 crc kubenswrapper[4849]: I0320 13:24:38.079284 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4849]: I0320 13:24:38.079318 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4849]: I0320 13:24:38.079329 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4849]: I0320 13:24:38.079353 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:38 crc kubenswrapper[4849]: E0320 13:24:38.080056 4849 controller.go:145] "Failed to ensure lease 
exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:24:38 crc kubenswrapper[4849]: E0320 13:24:38.080599 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:24:38 crc kubenswrapper[4849]: I0320 13:24:38.987306 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:39 crc kubenswrapper[4849]: W0320 13:24:39.455285 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 13:24:39 crc kubenswrapper[4849]: E0320 13:24:39.455423 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:24:39 crc kubenswrapper[4849]: I0320 13:24:39.985198 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:40 crc kubenswrapper[4849]: I0320 13:24:40.985433 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 20 13:24:41 crc kubenswrapper[4849]: E0320 13:24:41.100389 4849 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.183278 4849 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.183359 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.183424 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.183593 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.184577 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.184600 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.184607 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.184965 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.185117 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5" gracePeriod=30 Mar 20 13:24:41 crc kubenswrapper[4849]: E0320 13:24:41.191552 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f7d36cace5b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:24:41 crc kubenswrapper[4849]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f7d36cace5b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:24:41 crc kubenswrapper[4849]: body: Mar 20 13:24:41 crc kubenswrapper[4849]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:31.183498843 +0000 UTC 
m=+20.861222238,LastTimestamp:2026-03-20 13:24:41.183332118 +0000 UTC m=+30.861055553,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:24:41 crc kubenswrapper[4849]: > Mar 20 13:24:41 crc kubenswrapper[4849]: E0320 13:24:41.198465 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f7d36cbd162\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f7d36cbd162 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:31.183565154 +0000 UTC m=+20.861288549,LastTimestamp:2026-03-20 13:24:41.183393129 +0000 UTC m=+30.861116564,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:41 crc kubenswrapper[4849]: E0320 13:24:41.203234 4849 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f7f8aef3c50 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:41.185107024 +0000 UTC m=+30.862830449,LastTimestamp:2026-03-20 13:24:41.185107024 +0000 UTC m=+30.862830449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:41 crc kubenswrapper[4849]: E0320 13:24:41.311595 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f78c64ba020\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78c64ba020 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.11624656 +0000 UTC m=+1.793969955,LastTimestamp:2026-03-20 13:24:41.306754918 +0000 UTC m=+30.984478313,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:41 crc kubenswrapper[4849]: E0320 13:24:41.451393 4849 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f78d82f5450\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78d82f5450 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.416382032 +0000 UTC m=+2.094105427,LastTimestamp:2026-03-20 13:24:41.446941796 +0000 UTC m=+31.124665201,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:41 crc kubenswrapper[4849]: E0320 13:24:41.463174 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f78d8eae85a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f78d8eae85a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:12.428675162 +0000 UTC 
m=+2.106398587,LastTimestamp:2026-03-20 13:24:41.458405606 +0000 UTC m=+31.136129001,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:41 crc kubenswrapper[4849]: W0320 13:24:41.557916 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 13:24:41 crc kubenswrapper[4849]: E0320 13:24:41.557972 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:24:41 crc kubenswrapper[4849]: I0320 13:24:41.986456 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.183803 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.184118 4849 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5" exitCode=255 Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.184147 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5"} Mar 
20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.184183 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5"} Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.184284 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.185032 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.185065 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.185077 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.868072 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:42 crc kubenswrapper[4849]: I0320 13:24:42.986305 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:43 crc kubenswrapper[4849]: I0320 13:24:43.186338 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:43 crc kubenswrapper[4849]: I0320 13:24:43.186970 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4849]: I0320 13:24:43.187023 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4849]: I0320 13:24:43.187032 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4849]: I0320 13:24:43.986645 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:44 crc kubenswrapper[4849]: I0320 13:24:44.988954 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:45 crc kubenswrapper[4849]: I0320 13:24:45.080891 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:45 crc kubenswrapper[4849]: I0320 13:24:45.082768 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4849]: I0320 13:24:45.082804 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4849]: I0320 13:24:45.082839 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4849]: I0320 13:24:45.082864 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:45 crc kubenswrapper[4849]: E0320 13:24:45.086457 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:24:45 crc kubenswrapper[4849]: 
E0320 13:24:45.086715 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:24:45 crc kubenswrapper[4849]: I0320 13:24:45.989120 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.035095 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.036156 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.036190 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.036201 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.036794 4849 scope.go:117] "RemoveContainer" containerID="8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.197549 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.200979 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76"} Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 
13:24:46.201105 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.201802 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.201860 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.201871 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:46 crc kubenswrapper[4849]: W0320 13:24:46.243736 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 13:24:46 crc kubenswrapper[4849]: E0320 13:24:46.243791 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:24:46 crc kubenswrapper[4849]: I0320 13:24:46.986292 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.237921 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.239110 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.241289 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76" exitCode=255 Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.241366 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76"} Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.241480 4849 scope.go:117] "RemoveContainer" containerID="8999a3533457144144604dfd3cc14b29e2ffeb7f299ff9562da2f1367f7239eb" Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.241761 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.243090 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.243129 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.243143 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.243697 4849 scope.go:117] "RemoveContainer" containerID="4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76" Mar 20 13:24:47 crc kubenswrapper[4849]: E0320 13:24:47.243869 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:47 crc kubenswrapper[4849]: I0320 13:24:47.987140 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.109946 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.183065 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.183211 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.184276 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.184323 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.184336 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.246578 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.248760 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.249868 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.249913 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.249924 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.250531 4849 scope.go:117] "RemoveContainer" containerID="4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76" Mar 20 13:24:48 crc kubenswrapper[4849]: E0320 13:24:48.250724 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:48 crc kubenswrapper[4849]: I0320 13:24:48.986726 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:49 crc kubenswrapper[4849]: I0320 13:24:49.988412 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:50 crc kubenswrapper[4849]: I0320 13:24:50.983504 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:51 crc kubenswrapper[4849]: E0320 13:24:51.100513 4849 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.183945 4849 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.184022 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:24:51 crc kubenswrapper[4849]: E0320 13:24:51.188276 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f7d36cace5b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:24:51 crc kubenswrapper[4849]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f7d36cace5b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get 
"https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:24:51 crc kubenswrapper[4849]: body: Mar 20 13:24:51 crc kubenswrapper[4849]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:31.183498843 +0000 UTC m=+20.861222238,LastTimestamp:2026-03-20 13:24:51.184004397 +0000 UTC m=+40.861727792,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:24:51 crc kubenswrapper[4849]: > Mar 20 13:24:51 crc kubenswrapper[4849]: E0320 13:24:51.192590 4849 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f7d36cbd162\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f7d36cbd162 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:24:31.183565154 +0000 UTC m=+20.861288549,LastTimestamp:2026-03-20 13:24:51.184047628 +0000 UTC m=+40.861771013,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.356870 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.357096 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.358274 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.358320 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.358337 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.359046 4849 scope.go:117] "RemoveContainer" containerID="4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76" Mar 20 13:24:51 crc kubenswrapper[4849]: E0320 13:24:51.359255 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:51 crc kubenswrapper[4849]: I0320 13:24:51.986324 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:52 crc kubenswrapper[4849]: I0320 13:24:52.087302 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:52 crc kubenswrapper[4849]: I0320 13:24:52.088911 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4849]: I0320 13:24:52.088983 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4849]: I0320 13:24:52.088995 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4849]: I0320 13:24:52.089022 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:52 crc kubenswrapper[4849]: E0320 13:24:52.091711 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:24:52 crc kubenswrapper[4849]: E0320 13:24:52.091996 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:24:52 crc kubenswrapper[4849]: I0320 13:24:52.987808 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:53 crc kubenswrapper[4849]: I0320 13:24:53.988243 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:54 crc kubenswrapper[4849]: I0320 13:24:54.987965 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 20 13:24:55 crc kubenswrapper[4849]: W0320 13:24:55.130613 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 13:24:55 crc kubenswrapper[4849]: E0320 13:24:55.130681 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:24:55 crc kubenswrapper[4849]: I0320 13:24:55.986279 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:56 crc kubenswrapper[4849]: I0320 13:24:56.987538 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:57 crc kubenswrapper[4849]: W0320 13:24:57.227766 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:57 crc kubenswrapper[4849]: E0320 13:24:57.227849 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:24:57 crc 
kubenswrapper[4849]: I0320 13:24:57.988922 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:58 crc kubenswrapper[4849]: I0320 13:24:58.987812 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.092003 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.093709 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.094046 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.094125 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.094224 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:24:59 crc kubenswrapper[4849]: E0320 13:24:59.097323 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:24:59 crc kubenswrapper[4849]: E0320 13:24:59.097428 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" 
node="crc" Mar 20 13:24:59 crc kubenswrapper[4849]: W0320 13:24:59.872349 4849 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 13:24:59 crc kubenswrapper[4849]: E0320 13:24:59.872433 4849 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.925733 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.926189 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.928163 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.928224 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.928244 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.934044 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:24:59 crc kubenswrapper[4849]: I0320 13:24:59.986471 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 20 13:25:00 crc kubenswrapper[4849]: I0320 13:25:00.275541 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:00 crc kubenswrapper[4849]: I0320 13:25:00.276387 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4849]: I0320 13:25:00.276461 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4849]: I0320 13:25:00.276472 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4849]: I0320 13:25:00.986209 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:01 crc kubenswrapper[4849]: E0320 13:25:01.101272 4849 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:25:01 crc kubenswrapper[4849]: I0320 13:25:01.986831 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:02 crc kubenswrapper[4849]: I0320 13:25:02.989631 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:03 crc kubenswrapper[4849]: I0320 13:25:03.988491 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:04 crc kubenswrapper[4849]: I0320 13:25:04.986379 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:05 crc kubenswrapper[4849]: I0320 13:25:05.034732 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:05 crc kubenswrapper[4849]: I0320 13:25:05.035922 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:05 crc kubenswrapper[4849]: I0320 13:25:05.035964 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:05 crc kubenswrapper[4849]: I0320 13:25:05.035978 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:05 crc kubenswrapper[4849]: I0320 13:25:05.036583 4849 scope.go:117] "RemoveContainer" containerID="4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76" Mar 20 13:25:05 crc kubenswrapper[4849]: E0320 13:25:05.036754 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:25:05 crc kubenswrapper[4849]: I0320 13:25:05.989422 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.098002 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.099239 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.099280 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.099288 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.099312 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:25:06 crc kubenswrapper[4849]: E0320 13:25:06.102292 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:25:06 crc kubenswrapper[4849]: E0320 13:25:06.102542 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.112245 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.112385 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.113266 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.113302 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.113311 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:06 crc kubenswrapper[4849]: I0320 13:25:06.986027 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:07 crc kubenswrapper[4849]: I0320 13:25:07.985855 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:08 crc kubenswrapper[4849]: I0320 13:25:08.986410 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:09 crc kubenswrapper[4849]: I0320 13:25:09.988580 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:10 crc kubenswrapper[4849]: I0320 13:25:10.987694 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:11 crc kubenswrapper[4849]: E0320 13:25:11.101438 4849 eviction_manager.go:285] "Eviction 
manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:25:11 crc kubenswrapper[4849]: I0320 13:25:11.986738 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:12 crc kubenswrapper[4849]: I0320 13:25:12.985749 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:13 crc kubenswrapper[4849]: I0320 13:25:13.102995 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:13 crc kubenswrapper[4849]: I0320 13:25:13.104607 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:13 crc kubenswrapper[4849]: I0320 13:25:13.104629 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:13 crc kubenswrapper[4849]: I0320 13:25:13.104637 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:13 crc kubenswrapper[4849]: I0320 13:25:13.104658 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:25:13 crc kubenswrapper[4849]: E0320 13:25:13.106943 4849 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:25:13 crc kubenswrapper[4849]: E0320 13:25:13.107416 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" 
cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:25:13 crc kubenswrapper[4849]: I0320 13:25:13.985674 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:14 crc kubenswrapper[4849]: I0320 13:25:14.987195 4849 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:25:15 crc kubenswrapper[4849]: I0320 13:25:15.429571 4849 csr.go:261] certificate signing request csr-7skw8 is approved, waiting to be issued Mar 20 13:25:15 crc kubenswrapper[4849]: I0320 13:25:15.436491 4849 csr.go:257] certificate signing request csr-7skw8 is issued Mar 20 13:25:15 crc kubenswrapper[4849]: I0320 13:25:15.543127 4849 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 13:25:15 crc kubenswrapper[4849]: I0320 13:25:15.868662 4849 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 13:25:16 crc kubenswrapper[4849]: I0320 13:25:16.008899 4849 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:25:16 crc kubenswrapper[4849]: I0320 13:25:16.438198 4849 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-16 03:32:29.364018996 +0000 UTC Mar 20 13:25:16 crc kubenswrapper[4849]: I0320 13:25:16.438241 4849 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5774h7m12.925781227s for next certificate rotation Mar 20 13:25:20 crc kubenswrapper[4849]: 
I0320 13:25:20.034745 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.035876 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.035913 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.035924 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.036595 4849 scope.go:117] "RemoveContainer" containerID="4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.107609 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.108888 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.108932 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.108945 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.109053 4849 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.120924 4849 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.121250 4849 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.121317 4849 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.124093 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.124132 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.124145 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.124162 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.124176 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:20Z","lastTransitionTime":"2026-03-20T13:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.139537 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.147244 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.147282 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.147291 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.147306 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.147318 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:20Z","lastTransitionTime":"2026-03-20T13:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.157666 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status […status patch payload identical to the 13:25:20.139537 attempt above…] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.162979 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.163000 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.163008 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.163022 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.163032 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:20Z","lastTransitionTime":"2026-03-20T13:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.174535 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status […status patch payload identical to the 13:25:20.139537 attempt above; excerpt ends mid-payload…]
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.181429 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.181588 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.181673 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.181757 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.181870 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:20Z","lastTransitionTime":"2026-03-20T13:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.192479 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.192640 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.192672 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.293589 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.328787 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.331096 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9"} Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.331265 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.332152 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.332269 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:20 crc kubenswrapper[4849]: I0320 13:25:20.332344 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:20 
crc kubenswrapper[4849]: E0320 13:25:20.394253 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.495559 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.596463 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.696551 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.797355 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.898110 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:20 crc kubenswrapper[4849]: E0320 13:25:20.998450 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.099062 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.102296 4849 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.200116 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.300711 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.335890 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.336550 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.338740 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" exitCode=255 Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.338838 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9"} Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.338976 4849 scope.go:117] "RemoveContainer" containerID="4ea6e4d395076b71da4a396e50eddafe49841430885865db7cb363c88ae50b76" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.339135 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.340106 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.340169 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.340182 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.340801 4849 scope.go:117] "RemoveContainer" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" Mar 20 13:25:21 
crc kubenswrapper[4849]: E0320 13:25:21.341055 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:25:21 crc kubenswrapper[4849]: I0320 13:25:21.357006 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.401506 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.501669 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.601949 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.702981 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.803960 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:21 crc kubenswrapper[4849]: E0320 13:25:21.904628 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.005222 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.105883 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.207567 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.308718 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: I0320 13:25:22.343577 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:25:22 crc kubenswrapper[4849]: I0320 13:25:22.345598 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:22 crc kubenswrapper[4849]: I0320 13:25:22.346433 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:22 crc kubenswrapper[4849]: I0320 13:25:22.346489 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:22 crc kubenswrapper[4849]: I0320 13:25:22.346508 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:22 crc kubenswrapper[4849]: I0320 13:25:22.347702 4849 scope.go:117] "RemoveContainer" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.348283 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.409066 4849 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.509194 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.609712 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.710428 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.811392 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:22 crc kubenswrapper[4849]: E0320 13:25:22.912283 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.013201 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.113486 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.214511 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.315407 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.416235 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.516545 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 
13:25:23.617511 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.718347 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.819237 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:23 crc kubenswrapper[4849]: E0320 13:25:23.920228 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.020811 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.121392 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.221736 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.322573 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.423580 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.523735 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.624878 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.725686 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.825765 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:24 crc kubenswrapper[4849]: E0320 13:25:24.926226 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.027101 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.127442 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.227845 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.328628 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.429515 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.529803 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.630292 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.731265 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.831370 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:25 crc kubenswrapper[4849]: E0320 13:25:25.932223 4849 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.032867 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.133776 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.233908 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.334244 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: I0320 13:25:26.393490 4849 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.435161 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.536135 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.636326 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.736877 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.837873 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:26 crc kubenswrapper[4849]: E0320 13:25:26.938398 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.039442 4849 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.140471 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.241617 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.342580 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.443380 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.543706 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.644875 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.746056 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.846676 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:27 crc kubenswrapper[4849]: E0320 13:25:27.947051 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.048156 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: I0320 13:25:28.110335 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:28 crc 
kubenswrapper[4849]: I0320 13:25:28.110517 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:28 crc kubenswrapper[4849]: I0320 13:25:28.111464 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:28 crc kubenswrapper[4849]: I0320 13:25:28.111515 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:28 crc kubenswrapper[4849]: I0320 13:25:28.111529 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:28 crc kubenswrapper[4849]: I0320 13:25:28.112127 4849 scope.go:117] "RemoveContainer" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.112272 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.148840 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.249960 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.351072 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.451756 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.552992 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.654031 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.755057 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.856105 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:28 crc kubenswrapper[4849]: E0320 13:25:28.956441 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.057718 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.158873 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.259895 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.360770 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.461950 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.562604 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.663601 4849 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.764465 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.865307 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:29 crc kubenswrapper[4849]: E0320 13:25:29.965569 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.066022 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.167268 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.252214 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.256387 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.256426 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.256437 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.256454 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.256464 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:30Z","lastTransitionTime":"2026-03-20T13:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.266736 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.269620 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.269656 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.269672 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.269688 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.269700 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:30Z","lastTransitionTime":"2026-03-20T13:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.278218 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.282152 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.282210 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.282220 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.282235 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.282245 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:30Z","lastTransitionTime":"2026-03-20T13:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.290853 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.294248 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.294300 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.294311 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.294326 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:30 crc kubenswrapper[4849]: I0320 13:25:30.294335 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:30Z","lastTransitionTime":"2026-03-20T13:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.303173 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.303348 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.303393 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.404199 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.504605 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.605590 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.706263 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.806976 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:30 crc kubenswrapper[4849]: E0320 13:25:30.907914 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.008550 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.103444 4849 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: 
E0320 13:25:31.108626 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.209737 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.310522 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.411179 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.511471 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.611681 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.711990 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.813180 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:31 crc kubenswrapper[4849]: E0320 13:25:31.913949 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.015055 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.115419 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.216638 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.317741 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.418532 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.519196 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.619682 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.720016 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.820891 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:32 crc kubenswrapper[4849]: E0320 13:25:32.921170 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.021661 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.122417 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.223015 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.323990 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.424558 4849 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.525558 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.626602 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.727603 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.828681 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:33 crc kubenswrapper[4849]: E0320 13:25:33.929346 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.029976 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.130749 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.231788 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.332698 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.433441 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.534179 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.634791 4849 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.735122 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.835634 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:34 crc kubenswrapper[4849]: E0320 13:25:34.935895 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.035978 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.136837 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.237744 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.338489 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.439246 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.539717 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.640617 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.741686 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc 
kubenswrapper[4849]: E0320 13:25:35.842905 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:35 crc kubenswrapper[4849]: E0320 13:25:35.943385 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.043753 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.144851 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.245791 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.346945 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.447385 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.548061 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.648340 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.748658 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.849334 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:36 crc kubenswrapper[4849]: E0320 13:25:36.949935 4849 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: I0320 13:25:37.035330 4849 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:25:37 crc kubenswrapper[4849]: I0320 13:25:37.036488 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:37 crc kubenswrapper[4849]: I0320 13:25:37.036563 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:37 crc kubenswrapper[4849]: I0320 13:25:37.036577 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.050092 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.150634 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.250756 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.350912 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.451966 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.553149 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.653660 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.754527 4849 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.855618 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:37 crc kubenswrapper[4849]: E0320 13:25:37.956522 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:38 crc kubenswrapper[4849]: E0320 13:25:38.057523 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:38 crc kubenswrapper[4849]: E0320 13:25:38.158021 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:38 crc kubenswrapper[4849]: E0320 13:25:38.258958 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:38 crc kubenswrapper[4849]: E0320 13:25:38.359893 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:38 crc kubenswrapper[4849]: E0320 13:25:38.461036 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:38 crc kubenswrapper[4849]: E0320 13:25:38.562157 4849 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.608220 4849 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.664417 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.664454 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 
13:25:38.664464 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.664477 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.664488 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:38Z","lastTransitionTime":"2026-03-20T13:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.766649 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.766901 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.766984 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.767074 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.767146 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:38Z","lastTransitionTime":"2026-03-20T13:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.869507 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.869748 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.869834 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.869937 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.870054 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:38Z","lastTransitionTime":"2026-03-20T13:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.972081 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.972328 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.972412 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.972487 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:38 crc kubenswrapper[4849]: I0320 13:25:38.972549 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:38Z","lastTransitionTime":"2026-03-20T13:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.022288 4849 apiserver.go:52] "Watching apiserver" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.029152 4849 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.029721 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-dns/node-resolver-w7shr","openshift-multus/network-metrics-daemon-vm768","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/iptables-alerter-4ln5h","openshift-image-registry/node-ca-w65sz","openshift-machine-config-operator/machine-config-daemon-2pzdl","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-multus/multus-7nxh7","openshift-multus/multus-additional-cni-plugins-7cs2t","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw","openshift-ovn-kubernetes/ovnkube-node-7z7ql"] Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.030316 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.030384 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.030506 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.030554 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.030767 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.030855 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.031101 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.031203 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.031235 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.031264 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.031502 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.031590 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.031644 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.031756 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.032037 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.032071 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w7shr" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.032174 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.032498 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.033213 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.033731 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.034096 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.035797 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.035859 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.035878 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.035864 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.035975 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036102 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036235 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 
13:25:39.036240 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036245 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036434 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036618 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036699 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036880 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036885 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036918 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.036928 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037417 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037512 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037529 
4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037578 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037587 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037622 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037633 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037711 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037719 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037722 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037511 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037800 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037845 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:25:39 crc 
kubenswrapper[4849]: I0320 13:25:39.037898 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.037801 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.038072 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.038168 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.038508 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.052023 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.061444 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.068916 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.074633 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.074664 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 
13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.074672 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.074685 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.074695 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.080942 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.090905 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.091053 4849 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098220 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098272 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098296 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098319 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098339 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098359 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098378 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098397 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098418 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098440 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 
20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098461 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098480 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098499 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098518 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098537 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098558 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098584 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098611 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098632 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098653 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098675 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 
13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098696 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098720 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098742 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098802 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098847 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098871 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098892 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098913 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098936 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098956 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098976 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099000 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099021 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099043 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099068 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099091 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099111 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099130 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099153 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099217 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099243 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099268 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099291 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099316 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099347 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099369 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099392 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099418 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099441 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099520 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099546 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099568 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099588 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099611 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099634 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099655 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099677 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099701 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099724 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099745 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099769 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099790 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099833 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099862 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099885 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099906 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099928 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099954 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099978 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100002 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100024 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100047 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100070 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100092 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100114 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100136 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100156 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100176 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100195 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100218 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100238 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100259 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100280 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100302 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100326 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100351 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100374 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100395 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100417 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100438 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100460 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100482 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100502 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100524 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100546 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100572 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100596 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100620 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100645 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100666 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100688 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100707 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100757 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100778 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100812 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100884 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100959 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100983 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101007 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101029 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101053 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101076 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101099 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101124 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101146 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101358 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101382 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101405 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101441 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101467 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101490 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101512 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101539 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101562 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101589 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101615 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101642 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101669 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101695 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101719 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101742 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101764 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101788 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101809 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101848 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101878 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101899 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101922 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101981 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102009 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102032 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102056 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102081 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102107 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102133 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102158 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102184 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102208 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102233 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102258 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102285 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102277 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102354 4849 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098735 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.098975 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099257 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099288 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099292 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099371 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099481 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.099716 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100250 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100494 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.100538 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101476 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101596 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101898 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103605 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.101954 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102118 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102273 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102325 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102425 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.102457 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103389 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103576 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103616 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.103704 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:25:39.603684563 +0000 UTC m=+89.281408048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103431 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103770 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103797 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103837 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103857 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103874 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103893 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103912 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103928 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103926 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.103946 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104012 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104042 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104099 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104110 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104129 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104136 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104160 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104184 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104211 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104237 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104259 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104277 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104281 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104313 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104314 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104315 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104365 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104385 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104390 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104405 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104500 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104567 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104561 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104606 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104640 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104658 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104670 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104670 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104850 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.105124 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.105588 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.105649 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.104676 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.105698 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.105719 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.105838 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106119 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106162 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106182 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106203 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106228 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 
13:25:39.106252 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106270 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106292 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106311 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106326 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106342 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106360 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106415 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-env-overrides\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106436 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-script-lib\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106453 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-cni-multus\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106478 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106498 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-netns\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106516 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6656\" (UniqueName: \"kubernetes.io/projected/24edd4aa-ec92-450e-97bc-400c2a0171f0-kube-api-access-v6656\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106549 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-bin\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106578 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-socket-dir-parent\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106595 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-cnibin\") pod 
\"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106613 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/886ff165-f013-40a8-a6c1-92a16f6b00ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106628 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-node-log\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106645 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkwjw\" (UniqueName: \"kubernetes.io/projected/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-kube-api-access-kkwjw\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106660 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/423277f6-8ff5-40a2-90a2-6e8b09c16b46-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106677 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f6c5\" 
(UniqueName: \"kubernetes.io/projected/423277f6-8ff5-40a2-90a2-6e8b09c16b46-kube-api-access-4f6c5\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106692 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-systemd-units\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106708 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-var-lib-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106730 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106748 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-log-socket\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106776 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-kubelet\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106811 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106849 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsdf\" (UniqueName: \"kubernetes.io/projected/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-kube-api-access-6xsdf\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106869 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106887 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzwzq\" (UniqueName: \"kubernetes.io/projected/8ca35818-87a2-4dac-ad57-310ffe701961-kube-api-access-rzwzq\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") 
" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106904 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106919 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ba9a25c-6156-4c78-a394-60507829eced-ovn-node-metrics-cert\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106934 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-cni-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106949 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-cnibin\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106967 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/24edd4aa-ec92-450e-97bc-400c2a0171f0-serviceca\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " 
pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106981 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-netns\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.106997 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107012 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-system-cni-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107027 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-etc-kubernetes\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107050 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107069 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-etc-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107090 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-proxy-tls\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107106 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-systemd\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107122 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-daemon-config\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107141 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107157 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/886ff165-f013-40a8-a6c1-92a16f6b00ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107174 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-ovn-kubernetes\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107191 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bh57\" (UniqueName: \"kubernetes.io/projected/0ba9a25c-6156-4c78-a394-60507829eced-kube-api-access-7bh57\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107208 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/423277f6-8ff5-40a2-90a2-6e8b09c16b46-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107224 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/423277f6-8ff5-40a2-90a2-6e8b09c16b46-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107383 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107519 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107575 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107593 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107525 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107765 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107787 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-system-cni-dir\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107807 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-cni-bin\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107854 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107876 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107899 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107892 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107923 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-os-release\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107949 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24edd4aa-ec92-450e-97bc-400c2a0171f0-host\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107973 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-rootfs\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.107993 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-k8s-cni-cncf-io\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108013 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-hostroot\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 
13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108033 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d-hosts-file\") pod \"node-resolver-w7shr\" (UID: \"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\") " pod="openshift-dns/node-resolver-w7shr" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108053 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-ovn\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108071 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-slash\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108094 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-cni-binary-copy\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108117 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 
13:25:39.108140 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-os-release\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108171 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnrb\" (UniqueName: \"kubernetes.io/projected/6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d-kube-api-access-qjnrb\") pod \"node-resolver-w7shr\" (UID: \"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\") " pod="openshift-dns/node-resolver-w7shr" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108412 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108449 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108469 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108488 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-conf-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108503 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-multus-certs\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108557 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108681 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108839 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108856 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108971 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.108986 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.109137 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.109149 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.109341 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.109348 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.109520 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.110061 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.110100 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.110133 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.110139 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.110154 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.110271 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.110498 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.110989 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-config\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.111052 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-netd\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.111090 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.111123 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rzxv\" (UniqueName: \"kubernetes.io/projected/886ff165-f013-40a8-a6c1-92a16f6b00ae-kube-api-access-7rzxv\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.111151 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-kubelet\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.111265 4849 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.111310 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.111330 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:39.611313637 +0000 UTC m=+89.289037052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.111657 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.111745 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.111804 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.112518 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.112547 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.112558 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.112732 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.112989 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.113008 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.113221 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.113509 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.113664 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.113666 4849 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.113767 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.114118 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.114175 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.114201 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.114213 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.114532 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.114650 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.114901 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.115057 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.115131 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.115247 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.115446 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.115915 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.116125 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.116268 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.116642 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.116669 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.116725 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.116859 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.124703 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.121875 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.125080 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.125611 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.125873 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.128610 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.129814 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.129956 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.130236 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.116892 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.116994 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.117694 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.118154 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.119233 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.119466 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.119711 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.122051 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.122990 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.123575 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.123809 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.124158 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.124492 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.124514 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.117649 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.130765 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.131120 4849 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.131240 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:39.63122456 +0000 UTC m=+89.308947955 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.131449 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.131871 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.131963 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.132050 4849 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.132916 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.133019 4849 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:39.633004402 +0000 UTC m=+89.310727797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133354 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133376 4849 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133387 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133397 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133406 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133416 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133432 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133451 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133463 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133474 4849 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133489 4849 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133500 4849 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133509 4849 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133519 4849 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133528 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133539 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133551 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133564 4849 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133576 4849 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133585 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133594 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133603 4849 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133612 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133622 4849 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133631 4849 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133640 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133649 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133658 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133666 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133675 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133684 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133692 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133701 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133709 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133719 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133728 4849 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133736 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133745 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133753 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133763 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 
13:25:39.133773 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133791 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133803 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133814 4849 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.133846 4849 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.137938 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.138211 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.138487 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.138656 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.139001 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.139241 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.139382 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.140134 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.141028 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.141380 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.141547 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.142578 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.144789 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.144853 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.144868 4849 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.144925 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:39.64490567 +0000 UTC m=+89.322629135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.144654 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.149701 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.149774 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.149858 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.150354 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.150404 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.150606 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.150621 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.150936 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151207 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151189 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151242 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151274 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151279 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151302 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151397 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151475 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151512 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151656 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151727 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.152158 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151789 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151842 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.151846 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.152304 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.152361 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.152518 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.152773 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.152973 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.153268 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.153375 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.153735 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.154304 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.154552 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.154552 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.154912 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.156256 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.157577 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.157868 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.158146 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.158227 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.158773 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.158772 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.158860 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.158874 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.159216 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.159360 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.159389 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.159450 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.159740 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.159745 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.160028 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.160044 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.160065 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.160146 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.160286 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.160304 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.160065 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.161876 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.161949 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.162548 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.163872 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.164093 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.164287 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.164326 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.164413 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.164742 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.167029 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.167879 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.172646 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.172976 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.175485 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.180890 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.181624 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.181752 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.181785 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.181797 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.181837 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.181849 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.191806 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.201442 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.208441 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.221709 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.234854 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rzxv\" (UniqueName: \"kubernetes.io/projected/886ff165-f013-40a8-a6c1-92a16f6b00ae-kube-api-access-7rzxv\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.234889 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-kubelet\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.234922 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-netd\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.234948 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.234963 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-netns\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.234977 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-env-overrides\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235007 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-script-lib\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235023 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-cni-multus\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235055 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6656\" (UniqueName: \"kubernetes.io/projected/24edd4aa-ec92-450e-97bc-400c2a0171f0-kube-api-access-v6656\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235112 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-bin\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235130 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-cnibin\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235159 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-netd\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235199 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-cni-multus\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235169 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-netns\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235216 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235173 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/886ff165-f013-40a8-a6c1-92a16f6b00ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235408 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-cnibin\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235439 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-bin\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235476 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-socket-dir-parent\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235518 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-systemd-units\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235549 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-socket-dir-parent\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235655 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-systemd-units\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.235640 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-kubelet\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236268 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-env-overrides\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236297 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/886ff165-f013-40a8-a6c1-92a16f6b00ae-cni-binary-copy\") pod \"multus-additional-cni-plugins-7cs2t\" 
(UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236270 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-script-lib\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236605 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-var-lib-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236646 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-node-log\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236679 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkwjw\" (UniqueName: \"kubernetes.io/projected/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-kube-api-access-kkwjw\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236699 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-var-lib-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236704 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/423277f6-8ff5-40a2-90a2-6e8b09c16b46-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236723 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-node-log\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236788 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f6c5\" (UniqueName: \"kubernetes.io/projected/423277f6-8ff5-40a2-90a2-6e8b09c16b46-kube-api-access-4f6c5\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.236989 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237016 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsdf\" (UniqueName: \"kubernetes.io/projected/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-kube-api-access-6xsdf\") pod 
\"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237039 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-log-socket\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237059 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-kubelet\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237073 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-cnibin\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237087 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/24edd4aa-ec92-450e-97bc-400c2a0171f0-serviceca\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237103 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " 
pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237118 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzwzq\" (UniqueName: \"kubernetes.io/projected/8ca35818-87a2-4dac-ad57-310ffe701961-kube-api-access-rzwzq\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237153 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237172 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ba9a25c-6156-4c78-a394-60507829eced-ovn-node-metrics-cert\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237185 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-cni-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237199 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-netns\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" 
Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237222 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-etc-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237238 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237252 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-system-cni-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237296 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-etc-kubernetes\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237362 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-systemd\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237462 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-proxy-tls\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237486 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-daemon-config\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237514 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237543 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/886ff165-f013-40a8-a6c1-92a16f6b00ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237566 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-ovn-kubernetes\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237583 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bh57\" (UniqueName: \"kubernetes.io/projected/0ba9a25c-6156-4c78-a394-60507829eced-kube-api-access-7bh57\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237661 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/423277f6-8ff5-40a2-90a2-6e8b09c16b46-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237678 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/423277f6-8ff5-40a2-90a2-6e8b09c16b46-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237735 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-netns\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237694 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-system-cni-dir\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc 
kubenswrapper[4849]: I0320 13:25:39.237769 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-log-socket\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237869 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-kubelet\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237906 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-cnibin\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237933 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.237812 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-cni-bin\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238486 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/24edd4aa-ec92-450e-97bc-400c2a0171f0-host\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238501 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-rootfs\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238518 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-os-release\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238543 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d-hosts-file\") pod \"node-resolver-w7shr\" (UID: \"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\") " pod="openshift-dns/node-resolver-w7shr" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238566 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-ovn\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238587 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-k8s-cni-cncf-io\") pod \"multus-7nxh7\" 
(UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238602 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-hostroot\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238626 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-os-release\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238640 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnrb\" (UniqueName: \"kubernetes.io/projected/6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d-kube-api-access-qjnrb\") pod \"node-resolver-w7shr\" (UID: \"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\") " pod="openshift-dns/node-resolver-w7shr" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238653 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-slash\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238658 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.238764 4849 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.238869 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs podName:8ca35818-87a2-4dac-ad57-310ffe701961 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:39.738847121 +0000 UTC m=+89.416570516 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs") pod "network-metrics-daemon-vm768" (UID: "8ca35818-87a2-4dac-ad57-310ffe701961") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238867 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/423277f6-8ff5-40a2-90a2-6e8b09c16b46-env-overrides\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238932 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-cni-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.238667 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-cni-binary-copy\") pod \"multus-7nxh7\" 
(UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.240099 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-cni-binary-copy\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.240149 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-etc-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.240185 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-openvswitch\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.240230 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-system-cni-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.240242 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-etc-kubernetes\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.240259 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-var-lib-cni-bin\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.240275 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-systemd\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.240936 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/24edd4aa-ec92-450e-97bc-400c2a0171f0-serviceca\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.241742 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.241762 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-daemon-config\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.241910 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-ovn-kubernetes\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.242706 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/423277f6-8ff5-40a2-90a2-6e8b09c16b46-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.242938 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-ovn\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.242987 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-system-cni-dir\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243024 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24edd4aa-ec92-450e-97bc-400c2a0171f0-host\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243052 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-rootfs\") pod 
\"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243097 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-os-release\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243144 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d-hosts-file\") pod \"node-resolver-w7shr\" (UID: \"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\") " pod="openshift-dns/node-resolver-w7shr" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243187 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/886ff165-f013-40a8-a6c1-92a16f6b00ae-os-release\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243225 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-k8s-cni-cncf-io\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243389 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-slash\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 
20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243425 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-hostroot\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243459 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243503 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-config\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243530 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-conf-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243556 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-multus-certs\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243737 4849 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243759 4849 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243770 4849 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243782 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243807 4849 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243833 4849 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243845 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243867 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243884 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243909 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243921 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243932 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243942 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243954 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243966 4849 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 
20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243978 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.243988 4849 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244000 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244013 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244028 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244041 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244052 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc 
kubenswrapper[4849]: I0320 13:25:39.244068 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244081 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244091 4849 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244103 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244115 4849 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244132 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244144 4849 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244158 4849 reconciler_common.go:293] "Volume detached for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244169 4849 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244181 4849 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244172 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ba9a25c-6156-4c78-a394-60507829eced-ovn-node-metrics-cert\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244220 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-host-run-multus-certs\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244256 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.245052 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-config\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.245104 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-multus-conf-dir\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.245329 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/886ff165-f013-40a8-a6c1-92a16f6b00ae-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.244192 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.246992 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247030 4849 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247061 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247083 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247103 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247124 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247143 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247162 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247179 4849 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247197 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 
crc kubenswrapper[4849]: I0320 13:25:39.247214 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247233 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.247252 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.249213 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.252197 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f6c5\" (UniqueName: \"kubernetes.io/projected/423277f6-8ff5-40a2-90a2-6e8b09c16b46-kube-api-access-4f6c5\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.252923 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.252962 4849 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.252983 4849 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.252999 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253017 4849 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253034 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253048 4849 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253064 4849 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253089 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253103 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253118 4849 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253131 4849 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253145 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253159 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253173 4849 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253238 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253255 4849 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253269 4849 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253284 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253305 4849 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253318 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253333 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253364 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253401 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253414 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253425 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253453 4849 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253493 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253503 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253513 4849 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253532 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253541 4849 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253551 4849 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253563 4849 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253571 4849 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253581 4849 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253591 4849 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253601 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 
13:25:39.253612 4849 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253624 4849 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253638 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253649 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253660 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253671 4849 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253682 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253698 4849 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253709 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253727 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253736 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253744 4849 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253753 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253761 4849 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253769 4849 reconciler_common.go:293] "Volume 
detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253778 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253786 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253799 4849 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253807 4849 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253831 4849 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253840 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253849 4849 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253858 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253866 4849 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253874 4849 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253882 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253895 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253906 4849 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253917 4849 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253929 4849 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253939 4849 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253948 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253957 4849 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253964 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253973 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.253987 4849 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254002 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254013 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254024 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254035 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254043 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254051 4849 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254059 4849 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254068 4849 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath 
\"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254079 4849 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254090 4849 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254101 4849 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254112 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254128 4849 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254138 4849 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254148 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254159 4849 reconciler_common.go:293] "Volume detached for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254170 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254189 4849 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254200 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254213 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254225 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254236 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254247 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254258 4849 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254268 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254350 4849 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.254361 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.257096 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6656\" (UniqueName: \"kubernetes.io/projected/24edd4aa-ec92-450e-97bc-400c2a0171f0-kube-api-access-v6656\") pod \"node-ca-w65sz\" (UID: \"24edd4aa-ec92-450e-97bc-400c2a0171f0\") " pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.257406 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bh57\" (UniqueName: \"kubernetes.io/projected/0ba9a25c-6156-4c78-a394-60507829eced-kube-api-access-7bh57\") pod \"ovnkube-node-7z7ql\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.260677 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kkwjw\" (UniqueName: \"kubernetes.io/projected/606dc5eb-f89f-41cb-8aa2-f55fcab8f04d-kube-api-access-kkwjw\") pod \"multus-7nxh7\" (UID: \"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\") " pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.263157 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsdf\" (UniqueName: \"kubernetes.io/projected/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-kube-api-access-6xsdf\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.265008 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzwzq\" (UniqueName: \"kubernetes.io/projected/8ca35818-87a2-4dac-ad57-310ffe701961-kube-api-access-rzwzq\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.266840 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rzxv\" (UniqueName: \"kubernetes.io/projected/886ff165-f013-40a8-a6c1-92a16f6b00ae-kube-api-access-7rzxv\") pod \"multus-additional-cni-plugins-7cs2t\" (UID: \"886ff165-f013-40a8-a6c1-92a16f6b00ae\") " pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.267093 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/423277f6-8ff5-40a2-90a2-6e8b09c16b46-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-g2gxw\" (UID: \"423277f6-8ff5-40a2-90a2-6e8b09c16b46\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: 
I0320 13:25:39.267439 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9aefa038-8804-4eff-b0a9-3d6ce4a47a6a-proxy-tls\") pod \"machine-config-daemon-2pzdl\" (UID: \"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\") " pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.269145 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnrb\" (UniqueName: \"kubernetes.io/projected/6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d-kube-api-access-qjnrb\") pod \"node-resolver-w7shr\" (UID: \"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\") " pod="openshift-dns/node-resolver-w7shr" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.283591 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.283624 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.283636 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.283651 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.283662 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.350894 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.361222 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:25:39 crc kubenswrapper[4849]: W0320 13:25:39.362224 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-08b7178999afeef9efc0f1a082e8c266456e34a55a367e56f2542f0a37dee54e WatchSource:0}: Error finding container 08b7178999afeef9efc0f1a082e8c266456e34a55a367e56f2542f0a37dee54e: Status 404 returned error can't find the container with id 08b7178999afeef9efc0f1a082e8c266456e34a55a367e56f2542f0a37dee54e Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.368673 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:25:39 crc kubenswrapper[4849]: W0320 13:25:39.375935 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-c46b55042ed4352d7a5db82fbce4a34ab52ba3385351a28d1be033e254c225dc WatchSource:0}: Error finding container c46b55042ed4352d7a5db82fbce4a34ab52ba3385351a28d1be033e254c225dc: Status 404 returned error can't find the container with id c46b55042ed4352d7a5db82fbce4a34ab52ba3385351a28d1be033e254c225dc Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.376083 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-w65sz" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.383569 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.385326 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"08b7178999afeef9efc0f1a082e8c266456e34a55a367e56f2542f0a37dee54e"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.385893 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.385939 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.385951 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.385970 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.385983 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.386402 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c46b55042ed4352d7a5db82fbce4a34ab52ba3385351a28d1be033e254c225dc"} Mar 20 13:25:39 crc kubenswrapper[4849]: W0320 13:25:39.389859 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c7d99f2030a6bfa253d6d7f64e2ae575dd1093fc4c844255c31cd0c7a12f7bd5 WatchSource:0}: Error finding container c7d99f2030a6bfa253d6d7f64e2ae575dd1093fc4c844255c31cd0c7a12f7bd5: Status 404 returned error can't find the container with id c7d99f2030a6bfa253d6d7f64e2ae575dd1093fc4c844255c31cd0c7a12f7bd5 Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.391752 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7nxh7" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.399215 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.411262 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w7shr" Mar 20 13:25:39 crc kubenswrapper[4849]: W0320 13:25:39.424270 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aefa038_8804_4eff_b0a9_3d6ce4a47a6a.slice/crio-d4c3402e2740f0384e4177c05fa9d045e7319f13bedd889878da6a3248d08ede WatchSource:0}: Error finding container d4c3402e2740f0384e4177c05fa9d045e7319f13bedd889878da6a3248d08ede: Status 404 returned error can't find the container with id d4c3402e2740f0384e4177c05fa9d045e7319f13bedd889878da6a3248d08ede Mar 20 13:25:39 crc kubenswrapper[4849]: W0320 13:25:39.428205 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod606dc5eb_f89f_41cb_8aa2_f55fcab8f04d.slice/crio-d6bd7c03942aad75dbd26d6d36627b753c88c4a8f6c7cb70944a3339f71ad364 WatchSource:0}: Error finding container d6bd7c03942aad75dbd26d6d36627b753c88c4a8f6c7cb70944a3339f71ad364: Status 404 returned error can't find the container with id d6bd7c03942aad75dbd26d6d36627b753c88c4a8f6c7cb70944a3339f71ad364 Mar 20 13:25:39 crc kubenswrapper[4849]: W0320 13:25:39.435150 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod886ff165_f013_40a8_a6c1_92a16f6b00ae.slice/crio-ff40a95d8d2f61526f16139fc37b0d435bc81b30f41e915bbe24bda1bb90e00d WatchSource:0}: Error finding container ff40a95d8d2f61526f16139fc37b0d435bc81b30f41e915bbe24bda1bb90e00d: Status 404 returned error can't find the container with id ff40a95d8d2f61526f16139fc37b0d435bc81b30f41e915bbe24bda1bb90e00d Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.435176 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.461740 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.489410 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.489443 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.489453 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.489467 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.489481 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: W0320 13:25:39.495022 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d29eb6f_a1dd_4217_8b0f_9bdf8b654b5d.slice/crio-36b65fd77ed74ed9af6fc936ab40df334781126385c65a25f4f643a873fbfb44 WatchSource:0}: Error finding container 36b65fd77ed74ed9af6fc936ab40df334781126385c65a25f4f643a873fbfb44: Status 404 returned error can't find the container with id 36b65fd77ed74ed9af6fc936ab40df334781126385c65a25f4f643a873fbfb44 Mar 20 13:25:39 crc kubenswrapper[4849]: W0320 13:25:39.498078 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ba9a25c_6156_4c78_a394_60507829eced.slice/crio-fa2cf4e0ac5699c8e56b34b95e16adc893344ac006dd28aa0c1c51d2ec475922 WatchSource:0}: Error finding container fa2cf4e0ac5699c8e56b34b95e16adc893344ac006dd28aa0c1c51d2ec475922: Status 404 returned error can't find the container with id fa2cf4e0ac5699c8e56b34b95e16adc893344ac006dd28aa0c1c51d2ec475922 Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.605680 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.606032 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.606041 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.606054 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.606064 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.659870 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.660028 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660066 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:25:40.660033094 +0000 UTC m=+90.337756499 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.660121 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.660161 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.660228 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660257 4849 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:39 crc 
kubenswrapper[4849]: E0320 13:25:39.660346 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:40.660329603 +0000 UTC m=+90.338052998 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660448 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660474 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660487 4849 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660555 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:40.660539029 +0000 UTC m=+90.338262494 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660648 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660681 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660693 4849 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660693 4849 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660720 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:40.660711584 +0000 UTC m=+90.338435079 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.660757 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:40.660731034 +0000 UTC m=+90.338454429 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.708935 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.708983 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.708994 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.709012 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.709024 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.761260 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.761369 4849 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: E0320 13:25:39.761413 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs podName:8ca35818-87a2-4dac-ad57-310ffe701961 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:40.761400232 +0000 UTC m=+90.439123627 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs") pod "network-metrics-daemon-vm768" (UID: "8ca35818-87a2-4dac-ad57-310ffe701961") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.811912 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.811953 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.811963 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.811978 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.811987 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.914091 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.914132 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.914142 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.914157 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:39 crc kubenswrapper[4849]: I0320 13:25:39.914167 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:39Z","lastTransitionTime":"2026-03-20T13:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.016620 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.016663 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.016672 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.016714 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.016732 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.048757 4849 scope.go:117] "RemoveContainer" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.048889 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.049007 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.119517 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.119553 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.119561 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.119575 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.119585 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.222022 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.222105 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.222130 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.222155 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.222175 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.324829 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.324861 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.324868 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.324880 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.324889 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.393202 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c7d99f2030a6bfa253d6d7f64e2ae575dd1093fc4c844255c31cd0c7a12f7bd5"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.394337 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b" exitCode=0 Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.394404 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.394448 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"fa2cf4e0ac5699c8e56b34b95e16adc893344ac006dd28aa0c1c51d2ec475922"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.395788 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.395812 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.395845 4849 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"d4c3402e2740f0384e4177c05fa9d045e7319f13bedd889878da6a3248d08ede"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.396962 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7shr" event={"ID":"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d","Type":"ContainerStarted","Data":"026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.397028 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7shr" event={"ID":"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d","Type":"ContainerStarted","Data":"36b65fd77ed74ed9af6fc936ab40df334781126385c65a25f4f643a873fbfb44"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.400239 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7nxh7" event={"ID":"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d","Type":"ContainerStarted","Data":"26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.400283 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7nxh7" event={"ID":"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d","Type":"ContainerStarted","Data":"d6bd7c03942aad75dbd26d6d36627b753c88c4a8f6c7cb70944a3339f71ad364"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.402183 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.403665 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.403706 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.404762 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" event={"ID":"423277f6-8ff5-40a2-90a2-6e8b09c16b46","Type":"ContainerStarted","Data":"2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.404802 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" event={"ID":"423277f6-8ff5-40a2-90a2-6e8b09c16b46","Type":"ContainerStarted","Data":"fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.404857 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" event={"ID":"423277f6-8ff5-40a2-90a2-6e8b09c16b46","Type":"ContainerStarted","Data":"4bdcb25f03da1f3c97901c8a5c0ccd63dbd986088f1f409bf92f1eb5fb2235e9"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.406095 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w65sz" event={"ID":"24edd4aa-ec92-450e-97bc-400c2a0171f0","Type":"ContainerStarted","Data":"f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.406166 4849 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-image-registry/node-ca-w65sz" event={"ID":"24edd4aa-ec92-450e-97bc-400c2a0171f0","Type":"ContainerStarted","Data":"57c73b3376eba1f8f55356133d749621b450bf17b39df1d16682d9ae9798b4b1"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.408602 4849 generic.go:334] "Generic (PLEG): container finished" podID="886ff165-f013-40a8-a6c1-92a16f6b00ae" containerID="2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5" exitCode=0 Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.409089 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" event={"ID":"886ff165-f013-40a8-a6c1-92a16f6b00ae","Type":"ContainerDied","Data":"2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.409165 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" event={"ID":"886ff165-f013-40a8-a6c1-92a16f6b00ae","Type":"ContainerStarted","Data":"ff40a95d8d2f61526f16139fc37b0d435bc81b30f41e915bbe24bda1bb90e00d"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.409129 4849 scope.go:117] "RemoveContainer" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.409522 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.409922 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.426916 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.426952 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.426963 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.426981 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.426995 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.430954 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.442718 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.455777 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.476284 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.490934 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.504952 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.518776 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.529437 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.529463 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.529471 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.529484 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.529493 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.530710 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.546611 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.559051 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.575143 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.592435 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.605362 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 
2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.614573 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.634518 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.634548 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.634557 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.634570 4849 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.634579 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.636044 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.652118 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.662474 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.679611 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.679737 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:40 crc 
kubenswrapper[4849]: I0320 13:25:40.679771 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.679809 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.679857 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.679969 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.679984 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.679994 4849 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680036 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:42.68002319 +0000 UTC m=+92.357746585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680373 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:25:42.68036366 +0000 UTC m=+92.358087055 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680437 4849 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680476 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:42.680467433 +0000 UTC m=+92.358190828 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680534 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680544 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680552 4849 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680573 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:42.680567276 +0000 UTC m=+92.358290671 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680600 4849 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.680620 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:42.680614238 +0000 UTC m=+92.358337633 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.684202 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.688339 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.688384 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.688395 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.688411 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.688423 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.703201 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.704589 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.708013 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.708051 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.708067 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.708083 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.708092 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.715661 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.722027 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.726703 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.726742 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.726752 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.726773 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.726784 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.737764 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.738100 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c32
0837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime
\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.741366 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.741401 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.741411 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.741426 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.741438 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.751412 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.758244 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.761373 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.761396 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.761404 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.761417 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.761426 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.763516 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.772555 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.772708 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.774939 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.774965 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.774974 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.774986 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.774996 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.775666 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.781233 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.781363 4849 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: E0320 13:25:40.781435 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs podName:8ca35818-87a2-4dac-ad57-310ffe701961 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:42.78141788 +0000 UTC m=+92.459141275 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs") pod "network-metrics-daemon-vm768" (UID: "8ca35818-87a2-4dac-ad57-310ffe701961") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.786770 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.801171 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.810937 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc 
kubenswrapper[4849]: I0320 13:25:40.818888 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.830766 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.876681 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.876945 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.876955 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.876969 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.876978 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.979268 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.979304 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.979314 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.979329 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:40 crc kubenswrapper[4849]: I0320 13:25:40.979339 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:40Z","lastTransitionTime":"2026-03-20T13:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.035312 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.035339 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:41 crc kubenswrapper[4849]: E0320 13:25:41.035427 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.035460 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.035612 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:41 crc kubenswrapper[4849]: E0320 13:25:41.035629 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:41 crc kubenswrapper[4849]: E0320 13:25:41.035705 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:41 crc kubenswrapper[4849]: E0320 13:25:41.035975 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.039262 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.039949 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.041022 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.041636 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.042612 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.043162 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.043716 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.044638 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.045263 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.046189 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.046712 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.047769 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.048298 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.048775 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.051210 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.051710 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.052831 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.053269 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.053806 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.055313 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.057366 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.057946 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.059603 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.060231 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.061489 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.062105 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.062859 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.064621 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.065228 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.067273 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.067759 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.068255 4849 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.068700 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.070421 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.071858 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.072440 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.073442 4849 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.075259 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.076077 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.077413 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.079179 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.082253 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.082294 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.082309 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.082325 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.082336 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.082846 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.083581 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.084717 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.085932 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.086528 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.086636 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.088589 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.091436 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.093063 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 
13:25:41.095378 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.096217 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.097592 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.098422 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.098670 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.099338 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.102565 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.103569 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.144528 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.173977 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 
13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.185669 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.185722 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.185731 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.185746 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.185755 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.187988 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.204781 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9
a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.223438 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d
06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.236180 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.253581 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.266533 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.283855 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.287768 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc 
kubenswrapper[4849]: I0320 13:25:41.287810 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.287835 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.287850 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.287859 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.294043 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.303482 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.390073 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.390403 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.390412 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.390426 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.390436 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.413042 4849 generic.go:334] "Generic (PLEG): container finished" podID="886ff165-f013-40a8-a6c1-92a16f6b00ae" containerID="98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f" exitCode=0 Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.413097 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" event={"ID":"886ff165-f013-40a8-a6c1-92a16f6b00ae","Type":"ContainerDied","Data":"98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.417512 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.417576 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.417592 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.428995 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.445413 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.461207 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.476350 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc 
kubenswrapper[4849]: I0320 13:25:41.492952 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.500305 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.500358 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.500368 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.500381 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.500391 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.507451 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.523177 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.538529 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.551698 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.572264 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.584537 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.600125 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.606603 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.606644 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.606653 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.606669 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.606678 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.613077 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.625319 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.638222 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.709374 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.709418 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.709430 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.709449 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.709464 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.811286 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.811327 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.811337 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.811361 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.811370 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.913974 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.914034 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.914045 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.914068 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:41 crc kubenswrapper[4849]: I0320 13:25:41.914080 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:41Z","lastTransitionTime":"2026-03-20T13:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.017130 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.017168 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.017176 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.017189 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.017201 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.119363 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.119588 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.119704 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.119834 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.119938 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.222716 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.222747 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.222755 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.222768 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.222776 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.324790 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.324860 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.324869 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.324883 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.324894 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.423649 4849 generic.go:334] "Generic (PLEG): container finished" podID="886ff165-f013-40a8-a6c1-92a16f6b00ae" containerID="f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849" exitCode=0 Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.423742 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" event={"ID":"886ff165-f013-40a8-a6c1-92a16f6b00ae","Type":"ContainerDied","Data":"f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.427408 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.427447 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.427460 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.427474 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.427485 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.430008 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.430102 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.430131 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.431511 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.445434 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.458396 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.467988 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.477514 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.499774 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.512640 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 
13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.522035 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.530521 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.530631 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.530647 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.530668 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.530682 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.533003 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.542753 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.554470 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.563687 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc 
kubenswrapper[4849]: I0320 13:25:42.571382 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.582657 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.594632 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.606653 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.617072 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.627617 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.632412 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.632443 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.632451 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.632467 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.632476 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.650338 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.663287 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 
13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.673479 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.684150 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac802
1edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\
" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.694300 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.704610 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.705813 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.705920 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.705961 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.705980 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.706005 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706072 4849 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706114 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:25:46.706085647 +0000 UTC m=+96.383809042 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706156 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:46.706147549 +0000 UTC m=+96.383870944 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706157 4849 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706266 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:46.706245071 +0000 UTC m=+96.383968466 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706125 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706310 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706324 4849 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706192 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706378 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:46.706355385 +0000 UTC m=+96.384078770 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706381 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706406 4849 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.706461 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:46.706448867 +0000 UTC m=+96.384172372 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.716935 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.728857 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.734785 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.734809 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.734832 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.734848 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.734861 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.740149 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.752432 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.763865 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc 
kubenswrapper[4849]: I0320 13:25:42.776372 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.790679 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.807180 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.807387 4849 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: E0320 13:25:42.807476 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs podName:8ca35818-87a2-4dac-ad57-310ffe701961 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:46.807458875 +0000 UTC m=+96.485182260 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs") pod "network-metrics-daemon-vm768" (UID: "8ca35818-87a2-4dac-ad57-310ffe701961") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.837030 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.837059 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.837070 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.837084 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.837095 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.939477 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.939526 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.939537 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.939553 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:42 crc kubenswrapper[4849]: I0320 13:25:42.939565 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:42Z","lastTransitionTime":"2026-03-20T13:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.035228 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.035339 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:43 crc kubenswrapper[4849]: E0320 13:25:43.035371 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.035422 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.035467 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:43 crc kubenswrapper[4849]: E0320 13:25:43.035531 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:43 crc kubenswrapper[4849]: E0320 13:25:43.035658 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:43 crc kubenswrapper[4849]: E0320 13:25:43.035795 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.041840 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.041866 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.041874 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.041888 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.041897 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.144586 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.144628 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.144640 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.144657 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.144669 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.247351 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.247404 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.247417 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.247433 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.247444 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.350070 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.350106 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.350116 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.350128 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.350138 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.438626 4849 generic.go:334] "Generic (PLEG): container finished" podID="886ff165-f013-40a8-a6c1-92a16f6b00ae" containerID="6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea" exitCode=0 Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.438744 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" event={"ID":"886ff165-f013-40a8-a6c1-92a16f6b00ae","Type":"ContainerDied","Data":"6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.453678 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.454700 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.454733 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.454751 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.454768 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.464480 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.477004 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.495444 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.507976 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc 
kubenswrapper[4849]: I0320 13:25:43.523510 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.536959 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.550791 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.556488 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.556648 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.556718 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 
13:25:43.556797 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.556883 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.567130 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36
a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.587110 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.599287 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.610185 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.621048 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.629908 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.640352 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.650910 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.659308 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.659337 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.659349 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.659362 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.659371 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.761185 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.761219 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.761252 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.761268 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.761277 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.863658 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.863690 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.863699 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.863712 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.863721 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.972811 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.972868 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.972880 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.972894 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4849]: I0320 13:25:43.972904 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.074896 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.074926 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.074934 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.074948 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.074956 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.129719 4849 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.177253 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.177286 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.177296 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.177308 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.177317 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.279802 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.279859 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.279867 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.279882 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.279890 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.382663 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.382697 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.382704 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.382719 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.382726 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.444680 4849 generic.go:334] "Generic (PLEG): container finished" podID="886ff165-f013-40a8-a6c1-92a16f6b00ae" containerID="8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3" exitCode=0 Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.444754 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" event={"ID":"886ff165-f013-40a8-a6c1-92a16f6b00ae","Type":"ContainerDied","Data":"8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.448732 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.460904 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6d
c3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.476170 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.485930 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.485957 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.485966 4849 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.485979 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.485987 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.493688 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is 
after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.503986 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.517925 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e
01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.529339 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.545120 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.558448 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc 
kubenswrapper[4849]: I0320 13:25:44.579739 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.588551 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.588597 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.588610 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.588629 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.588641 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.592268 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.604134 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.616954 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.633705 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.647131 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.658366 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.691323 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.691379 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.691394 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.691412 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.691425 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.793675 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.793729 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.793737 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.793752 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.793761 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.895515 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.895555 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.895566 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.895581 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.895591 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.997011 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.997049 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.997066 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.997083 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:44 crc kubenswrapper[4849]: I0320 13:25:44.997094 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:44Z","lastTransitionTime":"2026-03-20T13:25:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.035026 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.035105 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.035044 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:45 crc kubenswrapper[4849]: E0320 13:25:45.035153 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.035228 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:45 crc kubenswrapper[4849]: E0320 13:25:45.035341 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:45 crc kubenswrapper[4849]: E0320 13:25:45.035421 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:45 crc kubenswrapper[4849]: E0320 13:25:45.035494 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.100552 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.100637 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.100665 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.100699 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.100724 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.203412 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.203470 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.203487 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.203509 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.203526 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.305355 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.305395 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.305406 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.305421 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.305432 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.407647 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.407686 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.407694 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.407713 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.407724 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.455462 4849 generic.go:334] "Generic (PLEG): container finished" podID="886ff165-f013-40a8-a6c1-92a16f6b00ae" containerID="39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4" exitCode=0 Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.455517 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" event={"ID":"886ff165-f013-40a8-a6c1-92a16f6b00ae","Type":"ContainerDied","Data":"39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.469854 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.480796 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc 
kubenswrapper[4849]: I0320 13:25:45.489575 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.502509 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.510889 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.510917 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.510927 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.510940 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.510948 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.516425 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.527099 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.550451 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.572156 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.591003 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.604452 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.616420 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.616464 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.616477 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.616495 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.616507 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.622550 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.635446 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 
13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.648007 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.658848 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac802
1edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\
" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.667573 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.718523 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.718553 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.718561 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.718575 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.718584 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.821216 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.821277 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.821286 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.821302 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.821320 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.924585 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.925206 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.925220 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.925251 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:45 crc kubenswrapper[4849]: I0320 13:25:45.925265 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:45Z","lastTransitionTime":"2026-03-20T13:25:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.027638 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.027682 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.027693 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.027715 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.027727 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.130128 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.130176 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.130187 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.130210 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.130222 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.233057 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.233102 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.233113 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.233131 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.233146 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.335442 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.335491 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.335503 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.335520 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.335534 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.438018 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.438052 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.438063 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.438077 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.438086 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.461932 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" event={"ID":"886ff165-f013-40a8-a6c1-92a16f6b00ae","Type":"ContainerStarted","Data":"7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.464993 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.483715 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.495635 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.506127 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.519702 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.540303 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.540340 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.540352 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.540368 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.540381 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.561556 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.575385 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 
13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.585262 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.597881 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac802
1edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\
" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.608955 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.617071 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.629206 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.639854 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.642190 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.642219 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.642231 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.642267 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.642280 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.650047 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.660041 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.669682 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc 
kubenswrapper[4849]: I0320 13:25:46.678733 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v
6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.687209 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.696654 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.708008 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.721317 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.732627 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.745010 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.745048 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.745059 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.745075 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.745087 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.749166 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:
25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.749404 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.749509 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.749542 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749572 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:25:54.749538956 +0000 UTC m=+104.427262351 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.749634 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749641 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749656 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.749663 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749666 4849 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749705 4849 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749680 4849 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749731 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:54.749722382 +0000 UTC m=+104.427445867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749784 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749840 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749853 4849 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749791 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:54.749777613 +0000 UTC m=+104.427501098 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749922 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:54.749900827 +0000 UTC m=+104.427624222 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.749959 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:54.749953249 +0000 UTC m=+104.427676644 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.759041 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc 
kubenswrapper[4849]: I0320 13:25:46.768568 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.780029 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.793440 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.810598 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.822002 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.839605 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.847400 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.847428 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.847436 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.847448 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.847456 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.850150 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.850322 4849 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: E0320 13:25:46.850397 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs podName:8ca35818-87a2-4dac-ad57-310ffe701961 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:54.850379549 +0000 UTC m=+104.528102944 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs") pod "network-metrics-daemon-vm768" (UID: "8ca35818-87a2-4dac-ad57-310ffe701961") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.852316 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.950362 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.950396 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.950404 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.950418 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:46 crc kubenswrapper[4849]: I0320 13:25:46.950428 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:46Z","lastTransitionTime":"2026-03-20T13:25:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.035220 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.035264 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.035325 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:47 crc kubenswrapper[4849]: E0320 13:25:47.035428 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.035441 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:47 crc kubenswrapper[4849]: E0320 13:25:47.035532 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:47 crc kubenswrapper[4849]: E0320 13:25:47.035618 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:47 crc kubenswrapper[4849]: E0320 13:25:47.035724 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.052258 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.052284 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.052291 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.052303 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.052320 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.154431 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.154891 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.155070 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.155262 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.155397 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.258006 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.258238 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.258324 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.258394 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.258461 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.361162 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.361230 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.361254 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.361285 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.361310 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.463574 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.463646 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.463658 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.463676 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.463688 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.472115 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.472182 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.472204 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.566151 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.566195 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.566207 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.566223 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.566235 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.669133 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.669177 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.669194 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.669216 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.669232 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.772437 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.772762 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.772791 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.772852 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.772871 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.874480 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.874728 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.874805 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.874908 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.874982 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.977267 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.977516 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.977695 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.977780 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:47 crc kubenswrapper[4849]: I0320 13:25:47.977876 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:47Z","lastTransitionTime":"2026-03-20T13:25:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.080485 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.080715 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.080854 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.080956 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.081034 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.183324 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.183514 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.183570 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.183628 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.183709 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.285731 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.286020 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.286109 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.286184 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.286259 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.388574 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.388766 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.388837 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.388927 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.389015 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.491196 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.491800 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.491932 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.492010 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.492080 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.594537 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.595044 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.595132 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.595220 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.595303 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.595375 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:48 crc kubenswrapper[4849]: E0320 13:25:48.595512 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.595993 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:48 crc kubenswrapper[4849]: E0320 13:25:48.596064 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.596091 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.596117 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:48 crc kubenswrapper[4849]: E0320 13:25:48.596190 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:48 crc kubenswrapper[4849]: E0320 13:25:48.596286 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.603445 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.605238 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.619545 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.633582 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.648490 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.666886 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc 
kubenswrapper[4849]: I0320 13:25:48.677213 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.690319 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.698423 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.698465 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.698477 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.698495 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.698509 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.709460 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.722852 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.735051 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.751450 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.761687 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.769645 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.779308 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.788973 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc 
kubenswrapper[4849]: I0320 13:25:48.800870 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.800902 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.800914 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.800928 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.800939 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.806395 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.825407 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.838179 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.851582 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.864064 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.881518 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.892833 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:3
9Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.902135 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.903028 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.903046 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.903054 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.903066 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.903076 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:48Z","lastTransitionTime":"2026-03-20T13:25:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.914099 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.925209 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.938782 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 
UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on [::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.949996 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.960373 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.972969 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc kubenswrapper[4849]: I0320 13:25:48.983599 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:48 crc 
kubenswrapper[4849]: I0320 13:25:48.993647 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.005410 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.005443 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.005453 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.005472 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.005485 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.108265 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.108344 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.108358 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.108383 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.108395 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.216553 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.216661 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.216690 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.216725 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.216752 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.319995 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.320026 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.320066 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.320082 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.320091 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.422573 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.422613 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.422621 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.422635 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.422646 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.525154 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.525183 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.525192 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.525205 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.525213 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.627469 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.627506 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.627517 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.627532 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.627542 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.730076 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.730119 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.730131 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.730150 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.730163 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.832181 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.832229 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.832240 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.832256 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.832268 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.935360 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.935408 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.935419 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.935436 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:49 crc kubenswrapper[4849]: I0320 13:25:49.935461 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:49Z","lastTransitionTime":"2026-03-20T13:25:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.035263 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.035303 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.035343 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.035358 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.035431 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.036117 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.036333 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.036418 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.038141 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.038170 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.038179 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.038194 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.038203 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.140864 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.140925 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.140942 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.140963 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.140978 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.243282 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.243326 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.243342 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.243359 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.243372 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.345805 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.345874 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.345884 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.345902 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.345912 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.448030 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.448081 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.448092 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.448112 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.448123 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.550470 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.550621 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.550643 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.550672 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.550695 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.606280 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/0.log" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.608980 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4" exitCode=1 Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.609021 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.610000 4849 scope.go:117] "RemoveContainer" containerID="b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.625397 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.642846 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.653237 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.653307 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.653318 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.653364 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.653378 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.656700 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:
25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.668426 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc 
kubenswrapper[4849]: I0320 13:25:50.679145 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.692219 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.707123 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.721169 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.738991 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:49Z\\\",\\\"message\\\":\\\"rewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:49.676526 6720 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:25:49.676808 6720 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:25:49.677193 6720 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:49.677323 6720 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:25:49.677687 6720 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:25:49.677721 6720 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:49.677727 6720 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:25:49.677760 6720 factory.go:656] Stopping watch factory\\\\nI0320 13:25:49.677778 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:49.677841 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:25:49.677853 6720 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278
b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.750544 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.755812 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.755868 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.755905 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc 
kubenswrapper[4849]: I0320 13:25:50.755923 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.755934 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.761765 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.771052 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6d
c3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.780183 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.791856 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.796796 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.796851 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.796861 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.796874 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.796882 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.800610 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.807271 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.811286 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.811388 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.811472 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.811533 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.811592 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.830055 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.833650 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.833778 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.834016 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.834204 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.834425 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.846223 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.850048 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.850093 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.850104 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.850123 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.850138 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.861468 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.865311 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.865375 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.865393 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.865416 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.865433 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.877790 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:50 crc kubenswrapper[4849]: E0320 13:25:50.878154 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.879884 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.879982 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.880043 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.880107 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.880166 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.982796 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.982857 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.982872 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.982889 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:50 crc kubenswrapper[4849]: I0320 13:25:50.982902 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:50Z","lastTransitionTime":"2026-03-20T13:25:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.049914 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.060979 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.071518 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.082907 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc 
kubenswrapper[4849]: I0320 13:25:51.084807 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.084868 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.084879 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.084894 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.084905 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.096482 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z 
is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.106418 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc 
kubenswrapper[4849]: I0320 13:25:51.122924 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.144336 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.158861 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.172996 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.187063 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.187126 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.187142 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.187226 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.187288 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.187747 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.201877 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.218492 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.232755 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.258708 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:49Z\\\",\\\"message\\\":\\\"rewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:49.676526 6720 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:25:49.676808 6720 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:25:49.677193 6720 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:49.677323 6720 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:25:49.677687 6720 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:25:49.677721 6720 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:49.677727 6720 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:25:49.677760 6720 factory.go:656] Stopping watch factory\\\\nI0320 13:25:49.677778 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:49.677841 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:25:49.677853 6720 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278
b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.290288 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.290330 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.290342 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.290357 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.290368 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.393149 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.393184 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.393194 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.393210 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.393219 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.500580 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.500627 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.500638 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.500654 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.500664 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.603244 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.603308 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.603325 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.603358 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.603374 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.614601 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/0.log" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.617935 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.618402 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.644147 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.660348 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.677928 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.691742 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.705305 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.705344 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.705353 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.705368 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.705379 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.714783 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:49Z\\\",\\\"message\\\":\\\"rewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:49.676526 6720 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:25:49.676808 6720 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:25:49.677193 6720 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:49.677323 6720 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:25:49.677687 6720 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:25:49.677721 6720 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:49.677727 6720 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:25:49.677760 6720 factory.go:656] Stopping watch factory\\\\nI0320 13:25:49.677778 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:49.677841 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:25:49.677853 6720 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.734256 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.744961 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.756362 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.767274 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc 
kubenswrapper[4849]: I0320 13:25:51.776025 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.787519 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.802252 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.807040 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.807072 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.807079 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.807094 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.807105 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.814974 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.828940 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.838345 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:51 crc 
kubenswrapper[4849]: I0320 13:25:51.909330 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.909369 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.909378 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.909393 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:51 crc kubenswrapper[4849]: I0320 13:25:51.909403 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:51Z","lastTransitionTime":"2026-03-20T13:25:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.011111 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.011148 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.011159 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.011174 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.011183 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.034957 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.035014 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:25:52 crc kubenswrapper[4849]: E0320 13:25:52.035058 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.035080 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:25:52 crc kubenswrapper[4849]: E0320 13:25:52.035186 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.035261 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768"
Mar 20 13:25:52 crc kubenswrapper[4849]: E0320 13:25:52.035280 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:25:52 crc kubenswrapper[4849]: E0320 13:25:52.035410 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.114246 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.114329 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.114370 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.114403 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.114421 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.216687 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.216720 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.216728 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.216743 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.216752 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.318328 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.318409 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.318421 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.318433 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.318441 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.421028 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.421081 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.421094 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.421112 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.421123 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.523583 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.523627 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.523638 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.523651 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.523659 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.622972 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/1.log"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.623789 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/0.log"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.624979 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.625027 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.625044 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.625066 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.625083 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.626217 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8" exitCode=1
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.626249 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8"}
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.626299 4849 scope.go:117] "RemoveContainer" containerID="b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.627326 4849 scope.go:117] "RemoveContainer" containerID="55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8"
Mar 20 13:25:52 crc kubenswrapper[4849]: E0320 13:25:52.627607 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced"
Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.640849 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.651249 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.661726 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.672672 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc 
kubenswrapper[4849]: I0320 13:25:52.683253 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.697783 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.711086 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.721275 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.727893 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.727931 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.727944 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.727962 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.727973 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.733053 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:
25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.743613 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc 
kubenswrapper[4849]: I0320 13:25:52.755490 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.767702 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.780065 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.789891 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.806281 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b93a35772d327f03cae89cd30a4807de330d29efcdfa73696278caed5fe771f4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:49Z\\\",\\\"message\\\":\\\"rewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:49.676526 6720 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:25:49.676808 6720 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:25:49.677193 6720 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:49.677323 6720 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:25:49.677687 6720 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:25:49.677721 6720 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:49.677727 6720 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:25:49.677760 6720 factory.go:656] Stopping watch factory\\\\nI0320 13:25:49.677778 6720 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:49.677841 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:25:49.677853 6720 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:52Z\\\",\\\"message\\\":\\\"lient/informers/externalversions/factory.go:117\\\\nI0320 13:25:52.009834 6872 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009969 6872 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009979 6872 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:25:52.010060 6872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:52.010169 6872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:25:52.010172 6872 factory.go:656] Stopping watch factory\\\\nI0320 13:25:52.010190 6872 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:25:52.012730 6872 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:25:52.013013 6872 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:25:52.013064 6872 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:52.013096 6872 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:25:52.013204 6872 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.829654 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.829687 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.829697 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.829711 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.829720 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.931419 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.931459 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.931470 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.931486 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:52 crc kubenswrapper[4849]: I0320 13:25:52.931498 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:52Z","lastTransitionTime":"2026-03-20T13:25:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.033548 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.033581 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.033592 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.033610 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.033622 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.136082 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.136121 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.136129 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.136143 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.136154 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.238135 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.238490 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.238506 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.238525 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.238538 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.341445 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.341494 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.341512 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.341531 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.341539 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.443879 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.443920 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.443928 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.443942 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.443952 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.546454 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.546498 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.546510 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.546528 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.546538 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.631315 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/1.log" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.636271 4849 scope.go:117] "RemoveContainer" containerID="55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8" Mar 20 13:25:53 crc kubenswrapper[4849]: E0320 13:25:53.636602 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.655931 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.656234 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.656344 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.656447 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.656542 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.684948 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.704088 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.716367 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.726352 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.742645 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:52Z\\\",\\\"message\\\":\\\"lient/informers/externalversions/factory.go:117\\\\nI0320 13:25:52.009834 6872 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009969 6872 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009979 6872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:25:52.010060 6872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:52.010169 6872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:25:52.010172 6872 factory.go:656] Stopping watch factory\\\\nI0320 13:25:52.010190 6872 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:25:52.012730 6872 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:25:52.013013 6872 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:25:52.013064 6872 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:52.013096 6872 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:25:52.013204 6872 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.755586 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.759042 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.759061 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.759069 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.759105 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.759114 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.767382 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.776892 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6d
c3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.786308 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.798409 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 
UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on [::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.809895 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.825532 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.844350 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.856331 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc 
kubenswrapper[4849]: I0320 13:25:53.861180 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.861244 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.861261 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.861688 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.861743 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.867673 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.964543 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.964907 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.965083 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.965247 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4849]: I0320 13:25:53.965404 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.035126 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.035156 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.035186 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.035303 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.035342 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.035436 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.035520 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.035576 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.036172 4849 scope.go:117] "RemoveContainer" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.036441 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.067916 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.068015 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.068034 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.068059 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.068076 4849 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.170985 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.171040 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.171052 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.171071 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.171083 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.273486 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.273536 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.273546 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.273562 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.273571 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.376023 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.376053 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.376061 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.376094 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.376103 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.479389 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.479449 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.479466 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.479488 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.479503 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.581402 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.581441 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.581452 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.581466 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.581478 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.684430 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.684475 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.684486 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.684503 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.684515 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.754038 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754155 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:26:10.754132905 +0000 UTC m=+120.431856320 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.754195 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.754279 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.754322 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.754376 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754423 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754454 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754472 4849 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754480 4849 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754527 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:10.754508676 +0000 UTC m=+120.432232081 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754550 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:10.754539217 +0000 UTC m=+120.432262632 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754544 4849 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754641 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:10.75461691 +0000 UTC m=+120.432340315 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754655 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754703 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754724 4849 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.754801 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:10.754778144 +0000 UTC m=+120.432501569 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.786575 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.786611 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.786624 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.786642 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.786653 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.855797 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.855976 4849 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: E0320 13:25:54.856065 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs podName:8ca35818-87a2-4dac-ad57-310ffe701961 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:10.856044979 +0000 UTC m=+120.533768384 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs") pod "network-metrics-daemon-vm768" (UID: "8ca35818-87a2-4dac-ad57-310ffe701961") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.888657 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.888705 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.888720 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.888778 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.888793 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.990716 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.991083 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.991228 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.991365 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:54 crc kubenswrapper[4849]: I0320 13:25:54.991493 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:54Z","lastTransitionTime":"2026-03-20T13:25:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.094390 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.094442 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.094455 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.094472 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.094484 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.196634 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.196664 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.196673 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.196687 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.196697 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.299384 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.299419 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.299431 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.299447 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.299457 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.401724 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.401766 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.401775 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.401789 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.401804 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.504444 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.504490 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.504502 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.504519 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.504531 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.606533 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.606811 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.606907 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.606972 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.607035 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.710400 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.711444 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.711700 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.712093 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.712274 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.815148 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.815222 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.815238 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.815261 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.815279 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.918261 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.918301 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.918312 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.918327 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:55 crc kubenswrapper[4849]: I0320 13:25:55.918337 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:55Z","lastTransitionTime":"2026-03-20T13:25:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.021200 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.021265 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.021286 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.021311 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.021329 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.035751 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.035855 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:56 crc kubenswrapper[4849]: E0320 13:25:56.036037 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.036099 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.036135 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:56 crc kubenswrapper[4849]: E0320 13:25:56.036273 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:56 crc kubenswrapper[4849]: E0320 13:25:56.036434 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:56 crc kubenswrapper[4849]: E0320 13:25:56.036571 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.123866 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.123903 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.123914 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.123928 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.123937 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.226646 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.226698 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.226708 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.226722 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.226734 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.330762 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.330891 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.330914 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.330938 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.330954 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.433901 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.433981 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.433992 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.434018 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.434035 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.536732 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.536783 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.536793 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.536808 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.536846 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.640405 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.640504 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.640531 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.640570 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.640599 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.743890 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.743976 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.743997 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.744028 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.744050 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.847397 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.847458 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.847471 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.847494 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.847509 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.950210 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.950259 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.950272 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.950292 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:56 crc kubenswrapper[4849]: I0320 13:25:56.950306 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:56Z","lastTransitionTime":"2026-03-20T13:25:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.053158 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.053207 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.053219 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.053238 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.053251 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.156522 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.156585 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.156594 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.156608 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.156618 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.259081 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.259119 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.259128 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.259383 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.259421 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.362837 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.362892 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.362905 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.362925 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.362938 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.467026 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.467075 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.467088 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.467114 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.467132 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.570189 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.570258 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.570276 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.570301 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.570322 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.672915 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.672947 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.672957 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.672973 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.672983 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.775447 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.775485 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.775495 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.775509 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.775516 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.877851 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.877885 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.877894 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.877910 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.877920 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.980338 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.980406 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.980419 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.980441 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:57 crc kubenswrapper[4849]: I0320 13:25:57.980459 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:57Z","lastTransitionTime":"2026-03-20T13:25:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.035414 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.035430 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.035567 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.035570 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:58 crc kubenswrapper[4849]: E0320 13:25:58.035663 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:58 crc kubenswrapper[4849]: E0320 13:25:58.035876 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:25:58 crc kubenswrapper[4849]: E0320 13:25:58.035969 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:58 crc kubenswrapper[4849]: E0320 13:25:58.035992 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.084042 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.084132 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.084156 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.084187 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.084211 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.187018 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.187109 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.187128 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.187159 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.187177 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.290266 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.290317 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.290328 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.290352 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.290365 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.393865 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.393914 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.393949 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.393969 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.393983 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.496629 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.496698 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.496714 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.496732 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.496745 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.600114 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.600172 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.600183 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.600208 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.600221 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.703354 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.703389 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.703397 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.703412 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.703422 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.806563 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.806620 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.806632 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.806653 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.806669 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.910289 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.910345 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.910358 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.910379 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:58 crc kubenswrapper[4849]: I0320 13:25:58.910392 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:58Z","lastTransitionTime":"2026-03-20T13:25:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.012412 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.012489 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.012512 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.012540 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.012565 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.056544 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.115490 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.115523 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.115532 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.115548 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.115557 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.218394 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.218439 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.218448 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.218468 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.218477 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.320996 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.321049 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.321064 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.321115 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.321132 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.423962 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.424046 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.424063 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.424080 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.424091 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.526435 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.526469 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.526478 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.526491 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.526500 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.629470 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.629510 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.629519 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.629533 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.629544 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.731480 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.732036 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.732091 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.732119 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.732141 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.835545 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.835607 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.835629 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.835658 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.835682 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.938072 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.938136 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.938146 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.938158 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:59 crc kubenswrapper[4849]: I0320 13:25:59.938165 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:59Z","lastTransitionTime":"2026-03-20T13:25:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.034866 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.034905 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.034920 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.034924 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:00 crc kubenswrapper[4849]: E0320 13:26:00.034983 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:00 crc kubenswrapper[4849]: E0320 13:26:00.035093 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:00 crc kubenswrapper[4849]: E0320 13:26:00.035198 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:00 crc kubenswrapper[4849]: E0320 13:26:00.035369 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.040152 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.040212 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.040224 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.040237 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.040248 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.142957 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.143029 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.143046 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.143068 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.143083 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.245753 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.245835 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.245851 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.245867 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.245878 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.347727 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.347772 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.347781 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.347796 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.347806 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.450168 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.450206 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.450217 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.450231 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.450241 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.552979 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.553030 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.553045 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.553064 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.553080 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.655264 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.655331 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.655350 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.655364 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.655374 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.757579 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.757617 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.757627 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.757644 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.757655 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.860013 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.860056 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.860064 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.860083 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.860098 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.962671 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.962707 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.962715 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.962730 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:00 crc kubenswrapper[4849]: I0320 13:26:00.962740 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:00Z","lastTransitionTime":"2026-03-20T13:26:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.047289 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.056937 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.064689 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.064720 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.064729 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.064746 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.064755 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.068394 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.079538 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.092939 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 
UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on [::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.104894 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.114344 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.125849 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.143770 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc 
kubenswrapper[4849]: I0320 13:26:01.152072 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.166719 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.166915 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.167066 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.167257 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.167353 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.168377 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.183535 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.194356 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.205746 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.214764 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.216960 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.217063 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.217125 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.217190 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.217245 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: E0320 13:26:01.227424 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.230482 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.230516 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.230526 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.230541 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.230552 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.235994 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:52Z\\\",\\\"message\\\":\\\"lient/informers/externalversions/factory.go:117\\\\nI0320 13:25:52.009834 6872 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009969 6872 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009979 6872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:25:52.010060 6872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:52.010169 6872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:25:52.010172 6872 factory.go:656] Stopping watch factory\\\\nI0320 13:25:52.010190 6872 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:25:52.012730 6872 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:25:52.013013 6872 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:25:52.013064 6872 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:52.013096 6872 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:25:52.013204 6872 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: E0320 13:26:01.242115 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.245746 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.245791 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.245800 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.245831 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.245845 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: E0320 13:26:01.257771 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.261113 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.261163 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.261188 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.261211 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.261229 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: E0320 13:26:01.272138 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.275260 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.275300 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.275310 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.275327 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.275337 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: E0320 13:26:01.285914 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:01 crc kubenswrapper[4849]: E0320 13:26:01.286050 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.287680 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.287721 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.287734 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.287753 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.287763 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.390026 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.390061 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.390069 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.390081 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.390090 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.492679 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.492718 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.492727 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.492740 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.492751 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.595361 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.595402 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.595413 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.595428 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.595440 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.697930 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.698224 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.698300 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.698375 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.698464 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.800810 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.801139 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.801283 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.801421 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.801550 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.903932 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.903984 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.903996 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.904014 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:01 crc kubenswrapper[4849]: I0320 13:26:01.904030 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:01Z","lastTransitionTime":"2026-03-20T13:26:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.005797 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.005862 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.005875 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.005889 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.005901 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.034912 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.035006 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.035060 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:02 crc kubenswrapper[4849]: E0320 13:26:02.035306 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.035381 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:02 crc kubenswrapper[4849]: E0320 13:26:02.035531 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:02 crc kubenswrapper[4849]: E0320 13:26:02.035727 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:02 crc kubenswrapper[4849]: E0320 13:26:02.035904 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.109851 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.109922 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.109948 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.109984 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.110015 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.213319 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.213359 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.213370 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.213387 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.213397 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.317324 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.317388 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.317409 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.317435 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.317453 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.420712 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.420778 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.420790 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.420807 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.420844 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.523549 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.523586 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.523595 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.523607 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.523616 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.626453 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.626511 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.626530 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.626556 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.626573 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.728505 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.728566 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.728576 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.728588 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.728597 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.831129 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.831167 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.831175 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.831187 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.831196 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.934045 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.934083 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.934094 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.934111 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:02 crc kubenswrapper[4849]: I0320 13:26:02.934123 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:02Z","lastTransitionTime":"2026-03-20T13:26:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.036428 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.036467 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.036476 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.036494 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.036504 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.139229 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.139315 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.139330 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.139356 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.139372 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.243314 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.243380 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.243391 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.243409 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.243421 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.346520 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.346599 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.346619 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.346651 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.346676 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.449860 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.449940 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.449961 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.449988 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.450005 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.553469 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.553529 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.553594 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.553634 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.553653 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.656449 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.656508 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.656525 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.656549 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.656566 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.758694 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.758737 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.758748 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.758762 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.758775 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.862370 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.862458 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.862484 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.862518 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.862540 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.965658 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.965756 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.965775 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.965798 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:03 crc kubenswrapper[4849]: I0320 13:26:03.965814 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:03Z","lastTransitionTime":"2026-03-20T13:26:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.035404 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.035457 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:04 crc kubenswrapper[4849]: E0320 13:26:04.035582 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.035606 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.035655 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:04 crc kubenswrapper[4849]: E0320 13:26:04.035813 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:04 crc kubenswrapper[4849]: E0320 13:26:04.035966 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:04 crc kubenswrapper[4849]: E0320 13:26:04.036080 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.067857 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.067900 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.067912 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.067943 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.067955 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.170269 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.170300 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.170308 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.170321 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.170332 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.272990 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.273039 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.273050 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.273073 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.273086 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.375962 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.376024 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.376042 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.376068 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.376085 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.478289 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.478316 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.478325 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.478337 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.478346 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.580635 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.580884 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.580984 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.581065 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.581127 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.682982 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.683014 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.683023 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.683035 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.683044 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.785557 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.785640 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.785662 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.785693 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.785712 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.888585 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.888632 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.888644 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.888661 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.888673 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.991317 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.992153 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.992222 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.992276 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:04 crc kubenswrapper[4849]: I0320 13:26:04.992355 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:04Z","lastTransitionTime":"2026-03-20T13:26:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.095769 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.095813 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.095837 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.095852 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.095863 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.198057 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.198102 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.198111 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.198131 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.198148 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.301533 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.301604 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.301620 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.301646 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.301662 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.404678 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.404726 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.404745 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.404767 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.404782 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.509102 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.509145 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.509158 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.509182 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.509196 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.613301 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.613370 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.613392 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.613426 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.613450 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.716543 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.716588 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.716597 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.716615 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.716626 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.820486 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.820924 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.820962 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.820985 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.820999 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.925029 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.925098 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.925118 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.925145 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:05 crc kubenswrapper[4849]: I0320 13:26:05.925174 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:05Z","lastTransitionTime":"2026-03-20T13:26:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.028711 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.028850 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.028871 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.028899 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.028917 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.035266 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.035409 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.035414 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.035262 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:06 crc kubenswrapper[4849]: E0320 13:26:06.035782 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:06 crc kubenswrapper[4849]: E0320 13:26:06.035621 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:06 crc kubenswrapper[4849]: E0320 13:26:06.036072 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:06 crc kubenswrapper[4849]: E0320 13:26:06.036403 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.056147 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.132420 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.132521 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.132549 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.132587 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.132609 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.235814 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.235896 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.235910 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.235933 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.235948 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.338178 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.338217 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.338225 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.338237 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.338246 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.441387 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.441460 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.441482 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.441510 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.441535 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.545486 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.545559 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.545580 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.545611 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.545633 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.648644 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.648722 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.648741 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.648780 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.648806 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.752132 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.752201 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.752221 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.752251 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.752273 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.855679 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.855724 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.855737 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.855753 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.855768 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.958598 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.958668 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.958692 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.958725 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:06 crc kubenswrapper[4849]: I0320 13:26:06.958750 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:06Z","lastTransitionTime":"2026-03-20T13:26:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.036490 4849 scope.go:117] "RemoveContainer" containerID="55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.061655 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.061712 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.061728 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.061752 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.061768 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.169554 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.169682 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.169951 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.169986 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.170001 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.272433 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.272465 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.272474 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.272488 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.272499 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.377188 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.377286 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.377309 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.377344 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.377365 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.480589 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.480638 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.480652 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.480670 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.480682 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.583781 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.583859 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.583871 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.583891 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.583903 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.681629 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/1.log" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.685633 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.685916 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.685963 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.685976 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.685994 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.686005 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.686308 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.699719 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s
2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.712129 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.726341 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.739125 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc 
kubenswrapper[4849]: I0320 13:26:07.748614 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.766938 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.783041 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 
UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on [::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.788231 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.788295 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.788311 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.788336 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.788353 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.798924 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network
-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.813104 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.826117 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.837998 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc 
kubenswrapper[4849]: I0320 13:26:07.849909 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.867987 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.880463 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.890262 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.890317 4849 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.890327 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.890341 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.890349 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.897618 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.909402 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.927645 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:52Z\\\",\\\"message\\\":\\\"lient/informers/externalversions/factory.go:117\\\\nI0320 13:25:52.009834 6872 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009969 6872 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009979 6872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:25:52.010060 6872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:52.010169 6872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:25:52.010172 6872 factory.go:656] Stopping watch factory\\\\nI0320 13:25:52.010190 6872 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:25:52.012730 6872 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:25:52.013013 6872 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:25:52.013064 6872 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:52.013096 6872 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:25:52.013204 6872 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:07Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.992878 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.993189 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.993307 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.993401 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:07 crc kubenswrapper[4849]: I0320 13:26:07.993483 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:07Z","lastTransitionTime":"2026-03-20T13:26:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.035480 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.035483 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.035496 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.036060 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:08 crc kubenswrapper[4849]: E0320 13:26:08.036160 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:08 crc kubenswrapper[4849]: E0320 13:26:08.036212 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:08 crc kubenswrapper[4849]: E0320 13:26:08.036295 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:08 crc kubenswrapper[4849]: E0320 13:26:08.036370 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.036717 4849 scope.go:117] "RemoveContainer" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.095110 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.095145 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.095154 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.095169 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.095178 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.197917 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.197959 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.197968 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.197983 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.197994 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.300208 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.300260 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.300268 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.300304 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.300330 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.403000 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.403040 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.403052 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.403086 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.403095 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.505946 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.505975 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.505983 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.505997 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.506006 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.608774 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.609054 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.609133 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.609221 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.609307 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.690535 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.692744 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.693115 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.694502 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/2.log" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.695091 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/1.log" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.698682 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1" exitCode=1 Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.698754 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.698845 4849 scope.go:117] "RemoveContainer" containerID="55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8" 
Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.700141 4849 scope.go:117] "RemoveContainer" containerID="61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1" Mar 20 13:26:08 crc kubenswrapper[4849]: E0320 13:26:08.700408 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.711165 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.711454 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.711463 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.711477 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.711487 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.714541 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.726656 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.739995 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.749879 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc 
kubenswrapper[4849]: I0320 13:26:08.759510 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.778218 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.791114 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 
UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on [::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.801292 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.813868 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.813895 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.813905 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.813919 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.813929 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.814577 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.832422 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:52Z\\\",\\\"message\\\":\\\"lient/informers/externalversions/factory.go:117\\\\nI0320 13:25:52.009834 6872 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009969 6872 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009979 6872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:25:52.010060 6872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:52.010169 6872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:25:52.010172 6872 factory.go:656] Stopping watch factory\\\\nI0320 13:25:52.010190 6872 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:25:52.012730 6872 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:25:52.013013 6872 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:25:52.013064 6872 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:52.013096 6872 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:25:52.013204 6872 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.844801 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.856074 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.865782 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.878366 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.889638 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.901426 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.910299 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.916079 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.916109 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.916121 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.916138 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.916148 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:08Z","lastTransitionTime":"2026-03-20T13:26:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.922048 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.932048 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.942871 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc kubenswrapper[4849]: I0320 13:26:08.954176 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc 
kubenswrapper[4849]: I0320 13:26:08.967759 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc 
kubenswrapper[4849]: I0320 13:26:08.979131 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:08 crc 
kubenswrapper[4849]: I0320 13:26:08.988105 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.006105 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.018757 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.018792 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.018806 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc 
kubenswrapper[4849]: I0320 13:26:09.018840 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.018853 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.018879 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\
\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.033089 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.045226 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.058563 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.073542 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.086602 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.098737 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.109268 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.121044 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.121092 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.121103 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.121120 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.121131 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.125697 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55b895cd64dfa2596a0ca303821c8e9c89d329b294f215bf3cb68f81ad270bb8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:52Z\\\",\\\"message\\\":\\\"lient/informers/externalversions/factory.go:117\\\\nI0320 13:25:52.009834 6872 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009969 6872 reflector.go:311] Stopping 
reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:25:52.009979 6872 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:25:52.010060 6872 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:25:52.010169 6872 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:25:52.010172 6872 factory.go:656] Stopping watch factory\\\\nI0320 13:25:52.010190 6872 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:25:52.012730 6872 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:25:52.013013 6872 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:25:52.013064 6872 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:25:52.013096 6872 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:25:52.013204 6872 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.223835 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.223883 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.223896 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.223915 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.223930 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.326163 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.326202 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.326209 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.326224 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.326234 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.428586 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.428636 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.428648 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.428663 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.428675 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.531370 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.531413 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.531424 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.531439 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.531449 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.633754 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.633802 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.633813 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.633855 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.633867 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.703392 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/2.log" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.710318 4849 scope.go:117] "RemoveContainer" containerID="61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1" Mar 20 13:26:09 crc kubenswrapper[4849]: E0320 13:26:09.710466 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.726017 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.738204 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.738228 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.738237 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.738249 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.738258 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.738795 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.764779 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.784992 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.804544 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.817306 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.828417 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc 
kubenswrapper[4849]: I0320 13:26:09.836525 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v
6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.840070 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.840115 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.840128 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.840149 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.840161 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.847025 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.858525 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08
930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.874534 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 
UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on [::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.887463 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.903309 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.914012 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.922595 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc 
kubenswrapper[4849]: I0320 13:26:09.931274 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.941456 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.941487 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.941497 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.941512 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.941524 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:09Z","lastTransitionTime":"2026-03-20T13:26:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:09 crc kubenswrapper[4849]: I0320 13:26:09.947553 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.035597 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.035645 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.035671 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.035742 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.035608 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.035831 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.035918 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.035971 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.044112 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.044165 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.044178 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.044192 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.044203 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.146554 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.146591 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.146602 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.146617 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.146629 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.248671 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.248709 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.248717 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.248729 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.248739 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.350858 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.350894 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.350905 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.350919 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.350928 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.453246 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.453295 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.453309 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.453330 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.453344 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.555274 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.555297 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.555305 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.555318 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.555327 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.657764 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.657794 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.657804 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.657820 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.657852 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.760375 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.760417 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.760427 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.760447 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.760458 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.829667 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.829802 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.829881 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.829910 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.829876669 +0000 UTC m=+152.507600064 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.829998 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.830037 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830003 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830122 4849 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830176 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830193 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830205 4849 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830145 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830296 4849 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830054 4849 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830254 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.83022686 +0000 UTC m=+152.507950285 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830361 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.830351233 +0000 UTC m=+152.508074738 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830376 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.830368754 +0000 UTC m=+152.508092259 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.830390 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.830382954 +0000 UTC m=+152.508106489 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.863638 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.863705 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.863727 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.863757 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.863780 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.930906 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.931052 4849 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: E0320 13:26:10.931105 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs podName:8ca35818-87a2-4dac-ad57-310ffe701961 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.931091893 +0000 UTC m=+152.608815288 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs") pod "network-metrics-daemon-vm768" (UID: "8ca35818-87a2-4dac-ad57-310ffe701961") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.966849 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.966886 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.966897 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.966912 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:10 crc kubenswrapper[4849]: I0320 13:26:10.966921 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:10Z","lastTransitionTime":"2026-03-20T13:26:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.051691 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.065408 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: E0320 13:26:11.067522 4849 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.076894 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.088099 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.098167 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc 
kubenswrapper[4849]: I0320 13:26:11.108614 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: E0320 13:26:11.132521 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.136197 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.151991 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\
\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.164091 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.175082 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.185721 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.197555 4849 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.210550 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.224265 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.239094 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.250143 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.269975 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.523218 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.523501 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.523511 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.523525 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.523536 4849 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:11Z","lastTransitionTime":"2026-03-20T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:26:11 crc kubenswrapper[4849]: E0320 13:26:11.536162 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.540388 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.540421 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.540430 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.540443 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.540453 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:11Z","lastTransitionTime":"2026-03-20T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:11 crc kubenswrapper[4849]: E0320 13:26:11.554475 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.557542 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.557572 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.557580 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.557593 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.557601 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:11Z","lastTransitionTime":"2026-03-20T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:11 crc kubenswrapper[4849]: E0320 13:26:11.569968 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.574084 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.574130 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.574143 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.574160 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.574173 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:11Z","lastTransitionTime":"2026-03-20T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:11 crc kubenswrapper[4849]: E0320 13:26:11.587069 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.590204 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.590234 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.590243 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.590258 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:11 crc kubenswrapper[4849]: I0320 13:26:11.590272 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:11Z","lastTransitionTime":"2026-03-20T13:26:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:11 crc kubenswrapper[4849]: E0320 13:26:11.601152 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:11 crc kubenswrapper[4849]: E0320 13:26:11.601415 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:26:12 crc kubenswrapper[4849]: I0320 13:26:12.034848 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:12 crc kubenswrapper[4849]: E0320 13:26:12.035182 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:12 crc kubenswrapper[4849]: I0320 13:26:12.034937 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:12 crc kubenswrapper[4849]: E0320 13:26:12.035366 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:12 crc kubenswrapper[4849]: I0320 13:26:12.034904 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:12 crc kubenswrapper[4849]: E0320 13:26:12.035522 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:12 crc kubenswrapper[4849]: I0320 13:26:12.034944 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:12 crc kubenswrapper[4849]: E0320 13:26:12.035695 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:14 crc kubenswrapper[4849]: I0320 13:26:14.035407 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:14 crc kubenswrapper[4849]: I0320 13:26:14.035471 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:14 crc kubenswrapper[4849]: I0320 13:26:14.035407 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:14 crc kubenswrapper[4849]: I0320 13:26:14.035438 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:14 crc kubenswrapper[4849]: E0320 13:26:14.035596 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:14 crc kubenswrapper[4849]: E0320 13:26:14.035740 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:14 crc kubenswrapper[4849]: E0320 13:26:14.035761 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:14 crc kubenswrapper[4849]: E0320 13:26:14.035802 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:16 crc kubenswrapper[4849]: I0320 13:26:16.034887 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:16 crc kubenswrapper[4849]: I0320 13:26:16.034933 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:16 crc kubenswrapper[4849]: I0320 13:26:16.034917 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:16 crc kubenswrapper[4849]: I0320 13:26:16.034888 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:16 crc kubenswrapper[4849]: E0320 13:26:16.035034 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:16 crc kubenswrapper[4849]: E0320 13:26:16.035091 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:16 crc kubenswrapper[4849]: E0320 13:26:16.035218 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:16 crc kubenswrapper[4849]: E0320 13:26:16.035304 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:16 crc kubenswrapper[4849]: E0320 13:26:16.133898 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:18 crc kubenswrapper[4849]: I0320 13:26:18.035257 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:18 crc kubenswrapper[4849]: E0320 13:26:18.036121 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:18 crc kubenswrapper[4849]: I0320 13:26:18.035367 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:18 crc kubenswrapper[4849]: E0320 13:26:18.036298 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:18 crc kubenswrapper[4849]: I0320 13:26:18.035427 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:18 crc kubenswrapper[4849]: E0320 13:26:18.036406 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:18 crc kubenswrapper[4849]: I0320 13:26:18.035268 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:18 crc kubenswrapper[4849]: E0320 13:26:18.036514 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:20 crc kubenswrapper[4849]: I0320 13:26:20.035552 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:20 crc kubenswrapper[4849]: I0320 13:26:20.035634 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:20 crc kubenswrapper[4849]: E0320 13:26:20.035749 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:20 crc kubenswrapper[4849]: I0320 13:26:20.035802 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:20 crc kubenswrapper[4849]: E0320 13:26:20.035959 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:20 crc kubenswrapper[4849]: E0320 13:26:20.035879 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:20 crc kubenswrapper[4849]: I0320 13:26:20.036212 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:20 crc kubenswrapper[4849]: E0320 13:26:20.036367 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.056136 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.073030 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\
\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.088155 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.103700 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.120030 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.133293 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc 
kubenswrapper[4849]: E0320 13:26:21.136376 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.153323 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.166229 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.180147 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.190594 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.200260 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.209879 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.226007 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.240367 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.249203 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.259640 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.269972 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc 
kubenswrapper[4849]: I0320 13:26:21.361108 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.373708 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.384053 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.394046 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.406084 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc 
kubenswrapper[4849]: I0320 13:26:21.422358 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb1711
0efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for 
caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on [::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4b
a8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.438210 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.451188 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.469908 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.480878 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc 
kubenswrapper[4849]: I0320 13:26:21.491597 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.514726 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.530577 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.544976 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.559697 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.573797 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.591797 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.604046 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.723479 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.723517 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.723529 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.723544 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.723553 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:21Z","lastTransitionTime":"2026-03-20T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:21 crc kubenswrapper[4849]: E0320 13:26:21.735629 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.739028 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.739080 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.739092 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.739110 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.739121 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:21Z","lastTransitionTime":"2026-03-20T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:21 crc kubenswrapper[4849]: E0320 13:26:21.750566 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.754224 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.754251 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.754260 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.754273 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.754284 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:21Z","lastTransitionTime":"2026-03-20T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:21 crc kubenswrapper[4849]: E0320 13:26:21.766627 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.769715 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.769744 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.769757 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.769772 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.769783 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:21Z","lastTransitionTime":"2026-03-20T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:21 crc kubenswrapper[4849]: E0320 13:26:21.784191 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.787198 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.787228 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.787236 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.787250 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:21 crc kubenswrapper[4849]: I0320 13:26:21.787260 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:21Z","lastTransitionTime":"2026-03-20T13:26:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:21 crc kubenswrapper[4849]: E0320 13:26:21.798276 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:21 crc kubenswrapper[4849]: E0320 13:26:21.798415 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:26:22 crc kubenswrapper[4849]: I0320 13:26:22.035798 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:22 crc kubenswrapper[4849]: I0320 13:26:22.035869 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:22 crc kubenswrapper[4849]: I0320 13:26:22.036028 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:22 crc kubenswrapper[4849]: I0320 13:26:22.035942 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:22 crc kubenswrapper[4849]: E0320 13:26:22.036296 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:22 crc kubenswrapper[4849]: E0320 13:26:22.036482 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:22 crc kubenswrapper[4849]: E0320 13:26:22.036697 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:22 crc kubenswrapper[4849]: E0320 13:26:22.036778 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:22 crc kubenswrapper[4849]: I0320 13:26:22.046866 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:26:24 crc kubenswrapper[4849]: I0320 13:26:24.035475 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:24 crc kubenswrapper[4849]: I0320 13:26:24.035524 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:24 crc kubenswrapper[4849]: I0320 13:26:24.035495 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:24 crc kubenswrapper[4849]: I0320 13:26:24.035492 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:24 crc kubenswrapper[4849]: E0320 13:26:24.035622 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:24 crc kubenswrapper[4849]: E0320 13:26:24.035702 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:24 crc kubenswrapper[4849]: E0320 13:26:24.035894 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:24 crc kubenswrapper[4849]: E0320 13:26:24.035988 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:25 crc kubenswrapper[4849]: I0320 13:26:25.036553 4849 scope.go:117] "RemoveContainer" containerID="61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1" Mar 20 13:26:25 crc kubenswrapper[4849]: E0320 13:26:25.036732 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.034724 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.034780 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.034750 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:26 crc kubenswrapper[4849]: E0320 13:26:26.034888 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.034912 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:26 crc kubenswrapper[4849]: E0320 13:26:26.034975 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:26 crc kubenswrapper[4849]: E0320 13:26:26.035028 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:26 crc kubenswrapper[4849]: E0320 13:26:26.035165 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:26 crc kubenswrapper[4849]: E0320 13:26:26.137865 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.761881 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/0.log" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.761933 4849 generic.go:334] "Generic (PLEG): container finished" podID="606dc5eb-f89f-41cb-8aa2-f55fcab8f04d" containerID="26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd" exitCode=1 Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.761975 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7nxh7" event={"ID":"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d","Type":"ContainerDied","Data":"26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd"} Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.762466 4849 scope.go:117] "RemoveContainer" containerID="26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.773711 4849 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.784230 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.794755 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.806040 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac
879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.817476 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe63cb2-e344-49b6-92dd-ae0338de46e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.830394 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.841111 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.852338 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:25:41+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd\\\\n2026-03-20T13:25:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd to /host/opt/cni/bin/\\\\n2026-03-20T13:25:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:25:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:26:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.861883 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.871357 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T
13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.889375 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.903354 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\
\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.914552 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.927238 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.937668 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.955216 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.966708 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:26 crc kubenswrapper[4849]: I0320 13:26:26.978142 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.768223 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/0.log" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.768286 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7nxh7" event={"ID":"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d","Type":"ContainerStarted","Data":"d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195"} Mar 20 13:26:27 crc 
kubenswrapper[4849]: I0320 13:26:27.783011 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.799478 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.815783 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.844424 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.859175 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.874864 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.886458 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.897397 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.908404 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc 
kubenswrapper[4849]: I0320 13:26:27.920199 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.931186 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe63cb2-e344-49b6-92dd-ae0338de46e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.942898 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.954080 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.966991 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:25:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd\\\\n2026-03-20T13:25:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd to /host/opt/cni/bin/\\\\n2026-03-20T13:25:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:25:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:26:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.977300 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:27 crc kubenswrapper[4849]: I0320 13:26:27.985618 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T
13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:28 crc kubenswrapper[4849]: I0320 13:26:28.004843 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:28Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:28 crc kubenswrapper[4849]: I0320 13:26:28.016596 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\
\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:28Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:28 crc kubenswrapper[4849]: I0320 13:26:28.034681 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:28 crc kubenswrapper[4849]: I0320 13:26:28.034763 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:28 crc kubenswrapper[4849]: I0320 13:26:28.034867 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:28 crc kubenswrapper[4849]: E0320 13:26:28.034927 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:28 crc kubenswrapper[4849]: E0320 13:26:28.035007 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:28 crc kubenswrapper[4849]: E0320 13:26:28.035103 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:28 crc kubenswrapper[4849]: I0320 13:26:28.035197 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:28 crc kubenswrapper[4849]: E0320 13:26:28.035394 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:30 crc kubenswrapper[4849]: I0320 13:26:30.035482 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:30 crc kubenswrapper[4849]: I0320 13:26:30.035593 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:30 crc kubenswrapper[4849]: E0320 13:26:30.035638 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:30 crc kubenswrapper[4849]: I0320 13:26:30.035765 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:30 crc kubenswrapper[4849]: E0320 13:26:30.035847 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:30 crc kubenswrapper[4849]: E0320 13:26:30.036007 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:30 crc kubenswrapper[4849]: I0320 13:26:30.036117 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:30 crc kubenswrapper[4849]: E0320 13:26:30.036206 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.048651 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.062207 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.074447 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc 
kubenswrapper[4849]: I0320 13:26:31.088927 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.103746 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe63cb2-e344-49b6-92dd-ae0338de46e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.120048 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.136126 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: E0320 13:26:31.138347 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.160943 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:25:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd\\\\n2026-03-20T13:25:41+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd to /host/opt/cni/bin/\\\\n2026-03-20T13:25:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:25:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:26:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.198456 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.220857 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T
13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.247293 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.261519 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\
\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.275078 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.291001 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.305578 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.325577 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.341664 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:31 crc kubenswrapper[4849]: I0320 13:26:31.359250 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.035678 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.035699 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.035801 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.035932 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.035969 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.036013 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.036321 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.036463 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.128655 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.128699 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.128710 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.128724 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.128735 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:32Z","lastTransitionTime":"2026-03-20T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.142731 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.147152 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.147194 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.147206 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.147223 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.147234 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:32Z","lastTransitionTime":"2026-03-20T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.157864 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.160941 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.160993 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.161008 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.161027 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.161043 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:32Z","lastTransitionTime":"2026-03-20T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.175218 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.178873 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.178907 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.178914 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.178926 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.178934 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:32Z","lastTransitionTime":"2026-03-20T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.191919 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.195440 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.195475 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.195487 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.195501 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:32 crc kubenswrapper[4849]: I0320 13:26:32.195511 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:32Z","lastTransitionTime":"2026-03-20T13:26:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.209805 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:32 crc kubenswrapper[4849]: E0320 13:26:32.209984 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:26:34 crc kubenswrapper[4849]: I0320 13:26:34.034804 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:34 crc kubenswrapper[4849]: I0320 13:26:34.034871 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:34 crc kubenswrapper[4849]: E0320 13:26:34.034971 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:34 crc kubenswrapper[4849]: I0320 13:26:34.034889 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:34 crc kubenswrapper[4849]: I0320 13:26:34.034984 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:34 crc kubenswrapper[4849]: E0320 13:26:34.035065 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:34 crc kubenswrapper[4849]: E0320 13:26:34.035304 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:34 crc kubenswrapper[4849]: E0320 13:26:34.035921 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.035373 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.035453 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.035526 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:36 crc kubenswrapper[4849]: E0320 13:26:36.035563 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:36 crc kubenswrapper[4849]: E0320 13:26:36.035626 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.035644 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:36 crc kubenswrapper[4849]: E0320 13:26:36.036030 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:36 crc kubenswrapper[4849]: E0320 13:26:36.036078 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.036367 4849 scope.go:117] "RemoveContainer" containerID="61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1" Mar 20 13:26:36 crc kubenswrapper[4849]: E0320 13:26:36.139439 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.801459 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/2.log" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.804505 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.805015 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.822235 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z 
is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.834182 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.848615 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c
320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.860830 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08a
af09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.872332 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.891890 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.903741 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\
\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.914640 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe63cb2-e344-49b6-92dd-ae0338de46e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.926633 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.937530 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.948340 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:25:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd\\\\n2026-03-20T13:25:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd to /host/opt/cni/bin/\\\\n2026-03-20T13:25:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:25:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:26:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.956985 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.967846 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.984696 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:36 crc kubenswrapper[4849]: I0320 13:26:36.997009 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.011290 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.025309 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.044092 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.809494 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/3.log" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.810310 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/2.log" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.813208 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f" exitCode=1 Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.813267 4849 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.813314 4849 scope.go:117] "RemoveContainer" containerID="61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.814350 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f" Mar 20 13:26:37 crc kubenswrapper[4849]: E0320 13:26:37.814840 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.828414 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.847518 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.868906 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.890757 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.909273 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.928181 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61f202c971a0b6cd59802cbb5c0fc0d23baf7aeb741eff6ae00285901b4d83e1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:07Z\\\",\\\"message\\\":\\\":190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:26:07.886071 7069 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:26:07.886078 7069 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:26:07.886108 7069 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 
13:26:07.886112 7069 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:26:07.886126 7069 factory.go:656] Stopping watch factory\\\\nI0320 13:26:07.886144 7069 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:26:07.886153 7069 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:26:07.886159 7069 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:26:07.886165 7069 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:26:07.886171 7069 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:26:07.886177 7069 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:07.886607 7069 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 13:26:07.886787 7069 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 13:26:07.887416 7069 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:07.887476 7069 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:07.887550 7069 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:37Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284268 7424 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284404 7424 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284716 7424 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284976 7424 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:26:37.285387 7424 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:26:37.285559 7424 factory.go:656] Stopping watch factory\\\\nI0320 13:26:37.285603 7424 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:37.307146 7424 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:26:37.307180 7424 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:26:37.307247 7424 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:37.307283 7424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:37.307401 7424 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-
cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.942473 4849 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.956589 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.970655 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:37 crc kubenswrapper[4849]: I0320 13:26:37.983560 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:37Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc 
kubenswrapper[4849]: I0320 13:26:38.007112 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.022164 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC 
(now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\
\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.034285 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe63cb2-e344-49b6-92dd-ae0338de46e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.035353 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.035400 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.035361 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:38 crc kubenswrapper[4849]: E0320 13:26:38.035468 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.035667 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:38 crc kubenswrapper[4849]: E0320 13:26:38.035550 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:38 crc kubenswrapper[4849]: E0320 13:26:38.035739 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:38 crc kubenswrapper[4849]: E0320 13:26:38.035794 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.046337 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.057352 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.069854 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:25:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd\\\\n2026-03-20T13:25:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd to /host/opt/cni/bin/\\\\n2026-03-20T13:25:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:25:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:26:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.080696 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.090878 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T
13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.818528 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/3.log" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.822441 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f" Mar 20 13:26:38 crc kubenswrapper[4849]: E0320 13:26:38.822852 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.835269 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.847182 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.859679 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.875948 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:37Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284268 7424 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284404 7424 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284716 7424 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284976 7424 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:26:37.285387 7424 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:26:37.285559 7424 factory.go:656] Stopping watch factory\\\\nI0320 13:26:37.285603 7424 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:37.307146 7424 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:26:37.307180 7424 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:26:37.307247 7424 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:37.307283 7424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:37.307401 7424 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.888504 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.898858 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.910278 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.921875 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc 
kubenswrapper[4849]: I0320 13:26:38.933966 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:25:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd\\\\n2026-03-20T13:25:41+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd to /host/opt/cni/bin/\\\\n2026-03-20T13:25:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:25:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:26:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.943368 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.953122 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T
13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.971800 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.983623 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\
\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:38 crc kubenswrapper[4849]: I0320 13:26:38.993468 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe63cb2-e344-49b6-92dd-ae0338de46e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:38Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:39 crc kubenswrapper[4849]: I0320 13:26:39.004437 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:39 crc kubenswrapper[4849]: I0320 13:26:39.015045 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:39 crc kubenswrapper[4849]: I0320 13:26:39.025676 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:39 crc kubenswrapper[4849]: I0320 13:26:39.040093 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:40 crc kubenswrapper[4849]: I0320 13:26:40.035669 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:40 crc kubenswrapper[4849]: I0320 13:26:40.035669 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:40 crc kubenswrapper[4849]: I0320 13:26:40.035686 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:40 crc kubenswrapper[4849]: I0320 13:26:40.035717 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:40 crc kubenswrapper[4849]: E0320 13:26:40.036216 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:40 crc kubenswrapper[4849]: E0320 13:26:40.036420 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:40 crc kubenswrapper[4849]: E0320 13:26:40.036464 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:40 crc kubenswrapper[4849]: E0320 13:26:40.036516 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.046623 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.057053 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.066413 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc 
kubenswrapper[4849]: I0320 13:26:41.077025 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d19eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.090977 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe63cb2-e344-49b6-92dd-ae0338de46e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.104296 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.115624 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.127624 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:25:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd\\\\n2026-03-20T13:25:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd to /host/opt/cni/bin/\\\\n2026-03-20T13:25:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:25:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:26:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.138151 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: E0320 13:26:41.140314 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.150276 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.169724 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.182215 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC (now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\
\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.194130 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.207041 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.218697 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.236028 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:37Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284268 7424 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284404 7424 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284716 7424 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284976 7424 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:26:37.285387 7424 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:26:37.285559 7424 factory.go:656] Stopping watch factory\\\\nI0320 13:26:37.285603 7424 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:37.307146 7424 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:26:37.307180 7424 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:26:37.307247 7424 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:37.307283 7424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:37.307401 7424 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.247931 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:41 crc kubenswrapper[4849]: I0320 13:26:41.259745 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.034932 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.034953 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.035019 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.035161 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.035247 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.035274 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.035442 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.035573 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.417007 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.417078 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.417098 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.417129 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.417147 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:42Z","lastTransitionTime":"2026-03-20T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.436770 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.441627 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.441670 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.441680 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.441695 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.441705 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:42Z","lastTransitionTime":"2026-03-20T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.458774 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.463950 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.463988 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.463999 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.464015 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.464027 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:42Z","lastTransitionTime":"2026-03-20T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.481245 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.485776 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.486484 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.486512 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.486538 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.486554 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:42Z","lastTransitionTime":"2026-03-20T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.499460 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.502794 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.502874 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.502893 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.502916 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.502929 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:42Z","lastTransitionTime":"2026-03-20T13:26:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.520074 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.520198 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.843099 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.843198 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843225 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:46.843198418 +0000 UTC m=+216.520921833 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843263 4849 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.843264 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843306 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:27:46.843293911 +0000 UTC m=+216.521017306 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.843334 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.843353 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843396 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843415 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843468 4849 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843504 4849 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843511 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:27:46.843499767 +0000 UTC m=+216.521223222 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843470 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843528 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:27:46.843521318 +0000 UTC m=+216.521244713 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843538 4849 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843550 4849 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.843584 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:27:46.843575739 +0000 UTC m=+216.521299174 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:26:42 crc kubenswrapper[4849]: I0320 13:26:42.943959 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.944148 4849 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:26:42 crc kubenswrapper[4849]: E0320 13:26:42.944214 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs podName:8ca35818-87a2-4dac-ad57-310ffe701961 nodeName:}" failed. No retries permitted until 2026-03-20 13:27:46.944200246 +0000 UTC m=+216.621923641 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs") pod "network-metrics-daemon-vm768" (UID: "8ca35818-87a2-4dac-ad57-310ffe701961") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:26:44 crc kubenswrapper[4849]: I0320 13:26:44.035226 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:44 crc kubenswrapper[4849]: E0320 13:26:44.035364 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:44 crc kubenswrapper[4849]: I0320 13:26:44.035372 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:44 crc kubenswrapper[4849]: I0320 13:26:44.035417 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:44 crc kubenswrapper[4849]: E0320 13:26:44.035509 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:44 crc kubenswrapper[4849]: I0320 13:26:44.035540 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:44 crc kubenswrapper[4849]: E0320 13:26:44.035686 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:44 crc kubenswrapper[4849]: E0320 13:26:44.035739 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:46 crc kubenswrapper[4849]: I0320 13:26:46.035416 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:46 crc kubenswrapper[4849]: I0320 13:26:46.035553 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:46 crc kubenswrapper[4849]: I0320 13:26:46.035551 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:46 crc kubenswrapper[4849]: I0320 13:26:46.035597 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:46 crc kubenswrapper[4849]: E0320 13:26:46.035631 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:46 crc kubenswrapper[4849]: E0320 13:26:46.035749 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:46 crc kubenswrapper[4849]: E0320 13:26:46.035783 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:46 crc kubenswrapper[4849]: E0320 13:26:46.035861 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:46 crc kubenswrapper[4849]: E0320 13:26:46.141410 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:48 crc kubenswrapper[4849]: I0320 13:26:48.035208 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:48 crc kubenswrapper[4849]: E0320 13:26:48.035926 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:48 crc kubenswrapper[4849]: I0320 13:26:48.035309 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:48 crc kubenswrapper[4849]: E0320 13:26:48.036008 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:48 crc kubenswrapper[4849]: I0320 13:26:48.035614 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:48 crc kubenswrapper[4849]: E0320 13:26:48.036050 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:48 crc kubenswrapper[4849]: I0320 13:26:48.035297 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:48 crc kubenswrapper[4849]: E0320 13:26:48.036089 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:48 crc kubenswrapper[4849]: I0320 13:26:48.045263 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 13:26:50 crc kubenswrapper[4849]: I0320 13:26:50.035698 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:50 crc kubenswrapper[4849]: I0320 13:26:50.035698 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:50 crc kubenswrapper[4849]: I0320 13:26:50.035720 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:50 crc kubenswrapper[4849]: I0320 13:26:50.035716 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:50 crc kubenswrapper[4849]: I0320 13:26:50.036508 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f" Mar 20 13:26:50 crc kubenswrapper[4849]: E0320 13:26:50.037134 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:26:50 crc kubenswrapper[4849]: E0320 13:26:50.037134 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:50 crc kubenswrapper[4849]: E0320 13:26:50.037320 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:50 crc kubenswrapper[4849]: E0320 13:26:50.037441 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:50 crc kubenswrapper[4849]: E0320 13:26:50.037488 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.048017 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c81547f-2d8f-44e1-9c1b-19847e15883d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6651f66bbfda244662fbafe6b03ba13712cb012dc7ffff1aa006805a3e29c443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003efb7e4024827de6fbf19b52af32b65ed3498dee9bae9127b5df2d13ea3711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://003efb7e4024827de6fbf19b52af32b65ed3498dee9bae9127b5df2d13ea3711\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.059645 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://636e24fca787746ca2aeddba732f88518a527efa88d66acd9ae0063395feb97a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://43818ed05068096094e308e776773095337d1
9eeb8851db85dac879c02d58468\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.070112 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w65sz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24edd4aa-ec92-450e-97bc-400c2a0171f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0adc5659a3bf9f51ef9d007489d08b5002fcbd4b58756fe19d9dc350d74c997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6656\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w65sz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.080758 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://add77d507a9a6f05c320837f8dcb3415ebd478d744ac7ddbb4ac8021edeed094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xsdf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pzdl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.092099 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"423277f6-8ff5-40a2-90a2-6e8b09c16b46\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa51229982dc92579060df9f6fc96a4c392484cbb0502ae4f0e30b5024f1a5fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd10fde27337f6b4f797e6b5ecbf7628a08930e9bc03a7a5a40f174dcdbe82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4f6c5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-g2gxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc 
kubenswrapper[4849]: I0320 13:26:51.112811 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"122c5e01-3b4d-42c9-a32b-cef724549b44\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da3899b368bd18eb8ccc6cae2f50a7088f3b224c419e4988efbfe3a1fd5c1a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://c68b8bd19036368f6d0f41094f8c10e35181c75e4db6a71be7500afe0ae44b93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d33816208a9c6611cae7eb01088d05f7fe19f1992ff666a3a636ff8064dea6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6cfd4f6c0a712385d409b398aac5c2ac1bda219774718b10cb778802c363356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://999a3278f8dfe780c913c7123064ff2d393547f12021dafcfa596e1a74c480d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05b0e5d8f18b316f8c4f1b396a9f338c1cb58837deb245f0af5f111ea2eadbb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ccdb570efc813b1f0dc80ecc5494f2bdaf002dbc0a23ce05e7114b3642c78a8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6bb006e3cd2cf53687f2662e8499e6ad7f388479c2d7c26bd8668b158f8473e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.126849 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55d45a10-c0f3-44bd-b133-ff8a72a02483\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:25:20Z\\\",\\\"message\\\":\\\".850559 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850623 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI0320 13:25:20.850506 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI0320 13:25:20.850747 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1774013120\\\\\\\\\\\\\\\" (2026-03-20 13:25:19 +0000 UTC to 2026-04-19 13:25:20 +0000 UTC 
(now=2026-03-20 13:25:20.850719202 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850846 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1758052850/tls.crt::/tmp/serving-cert-1758052850/tls.key\\\\\\\"\\\\nI0320 13:25:20.850922 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1774013120\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1774013120\\\\\\\\\\\\\\\" (2026-03-20 12:25:20 +0000 UTC to 2027-03-20 12:25:20 +0000 UTC (now=2026-03-20 13:25:20.850900247 +0000 UTC))\\\\\\\"\\\\nI0320 13:25:20.850751 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI0320 13:25:20.850465 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI0320 13:25:20.851063 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI0320 13:25:20.850738 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI0320 13:25:20.850946 1 secure_serving.go:213] Serving securely on 
[::]:17697\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\
\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.138474 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbe63cb2-e344-49b6-92dd-ae0338de46e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fc06108fe8dcf1a696a9bb8c68fe922d825578acf0f5a8aaf6c45078584318b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a14393f8dbdb1a9ee455dcf2647fba63efdbbfbece56f255c7b977f14264f0a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a2542f5ddbf6ee41eb9306cddd19192c57f315fe969dbb77a20983dc48a0a3b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-03-20T13:24:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6ce5aee28c76ef0d31c5aa22679893ba8d06347572e875c51f3ddf73618d66e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: E0320 13:26:51.144836 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.156047 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fa1f43d54abbdfc799cb41e5dddc408f8f163eabdc109a09079124a0ece1e0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.167291 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.180091 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7nxh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:25:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd\\\\n2026-03-20T13:25:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_eb56d090-42bb-41dd-935b-3f76d4eae1cd to /host/opt/cni/bin/\\\\n2026-03-20T13:25:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:25:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:26:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:26:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kkwjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7nxh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.190768 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vm768" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8ca35818-87a2-4dac-ad57-310ffe701961\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzwzq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vm768\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.200970 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w7shr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d29eb6f-a1dd-4217-8b0f-9bdf8b654b5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://026b6f662a074b5b41d2ddf24796abd8bdbe8702f68b31230fc6c0df5dbb8950\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T
13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qjnrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w7shr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.212500 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.225279 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"886ff165-f013-40a8-a6c1-92a16f6b00ae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a16adc6f52f46da8a89d59cb92785469f0cf64ce4a103703a8a02fee9cfca4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2392d7777212f602f97f46816e4c6688e4ca434c3a1060ee53bf6af161b2c6c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98c9ba8345469a5fc1c18010b09d215555180eaf727a3693b6807778493ec13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f849051c3654b09463edb8d0d10871949ad36a4e02fbb7ccd7dd974895d1b849\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dd4f
9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6dd4f9f52d6cb49b629858107df7bf867f54394c7861f6037354e940f4011cea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8de657cac72009aef4ed5ff6e2d6927dfe009f3548a493cddc3e4a2cf3864cf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39284deb965b6295a76db482161e7423f48403ac338b7aef28a21a9dfed4aec4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzxv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7cs2t\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.236274 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"85b8d485-b847-44d1-bb86-c8feb7c4601c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f68d70c3e57820ee74bdfea060228dcbce10f68255dd105fbdda364212550d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://786f97e194cf82f9d9a2e20d5c9236a20429080284bf40bf66dace168c4f8ea5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:24:13.061015 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:24:13.063616 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:24:13.101118 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:24:13.105604 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:24:41.189584 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:24:41.189682 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1534043ef8d555c4b7bc092dc176ce0276f49553241347d3f749db55f035fce3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663dcbaf82ca5269f0c4846215f4b3ca5ed634b5dc3feca4611c615e54921d3a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:11Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.248080 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.259949 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee39f19a3ef558af599ef670ffa538dd0ddb414f2ac6984079ab68e7066db702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:51 crc kubenswrapper[4849]: I0320 13:26:51.276959 4849 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ba9a25c-6156-4c78-a394-60507829eced\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:41Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:26:37Z\\\",\\\"message\\\":\\\"] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284268 7424 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284404 7424 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284716 7424 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:26:37.284976 7424 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:26:37.285387 7424 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:26:37.285559 7424 factory.go:656] Stopping watch factory\\\\nI0320 13:26:37.285603 7424 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:26:37.307146 7424 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0320 13:26:37.307180 7424 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0320 13:26:37.307247 7424 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:26:37.307283 7424 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:26:37.307401 7424 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:26:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68787a872efefb1b6
646d399934a20c734dd2b8030786a14324acf278b0a96b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:25:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7bh57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:25:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7z7ql\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.035391 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.035421 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.035795 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.035945 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.035481 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.035443 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.036045 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.036101 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.653422 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.653461 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.653470 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.653486 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.653499 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:52Z","lastTransitionTime":"2026-03-20T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.673610 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.677895 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.677944 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.677954 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.677968 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.677977 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:52Z","lastTransitionTime":"2026-03-20T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.695197 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.698648 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.698675 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.698685 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.698696 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.698706 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:52Z","lastTransitionTime":"2026-03-20T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.710375 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.713727 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.713758 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.713767 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.713779 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.713789 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:52Z","lastTransitionTime":"2026-03-20T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.731928 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.736167 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.736196 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.736204 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.736218 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:26:52 crc kubenswrapper[4849]: I0320 13:26:52.736230 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:26:52Z","lastTransitionTime":"2026-03-20T13:26:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.749665 4849 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:26:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c9268129-01d7-4b12-98d7-58087a6062f7\\\",\\\"systemUUID\\\":\\\"5558133e-3d97-4e22-9873-bad3dbc7167b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:26:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:26:52 crc kubenswrapper[4849]: E0320 13:26:52.750092 4849 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:26:54 crc kubenswrapper[4849]: I0320 13:26:54.035084 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:54 crc kubenswrapper[4849]: I0320 13:26:54.035084 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:54 crc kubenswrapper[4849]: I0320 13:26:54.035094 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:54 crc kubenswrapper[4849]: I0320 13:26:54.035737 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:54 crc kubenswrapper[4849]: E0320 13:26:54.035914 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:54 crc kubenswrapper[4849]: E0320 13:26:54.036016 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:54 crc kubenswrapper[4849]: E0320 13:26:54.036105 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:54 crc kubenswrapper[4849]: E0320 13:26:54.036256 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:56 crc kubenswrapper[4849]: I0320 13:26:56.035646 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:56 crc kubenswrapper[4849]: I0320 13:26:56.035943 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:56 crc kubenswrapper[4849]: E0320 13:26:56.036052 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:56 crc kubenswrapper[4849]: I0320 13:26:56.036093 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:56 crc kubenswrapper[4849]: E0320 13:26:56.036157 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:56 crc kubenswrapper[4849]: E0320 13:26:56.036264 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:56 crc kubenswrapper[4849]: I0320 13:26:56.036302 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:56 crc kubenswrapper[4849]: E0320 13:26:56.036615 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:56 crc kubenswrapper[4849]: E0320 13:26:56.146186 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:58 crc kubenswrapper[4849]: I0320 13:26:58.035356 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:58 crc kubenswrapper[4849]: I0320 13:26:58.035392 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:26:58 crc kubenswrapper[4849]: E0320 13:26:58.035534 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:58 crc kubenswrapper[4849]: I0320 13:26:58.035575 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:58 crc kubenswrapper[4849]: I0320 13:26:58.035598 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:58 crc kubenswrapper[4849]: E0320 13:26:58.035687 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:26:58 crc kubenswrapper[4849]: E0320 13:26:58.035771 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:58 crc kubenswrapper[4849]: E0320 13:26:58.036108 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:00 crc kubenswrapper[4849]: I0320 13:27:00.035697 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:00 crc kubenswrapper[4849]: I0320 13:27:00.035774 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:00 crc kubenswrapper[4849]: I0320 13:27:00.035710 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:00 crc kubenswrapper[4849]: E0320 13:27:00.035857 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:00 crc kubenswrapper[4849]: E0320 13:27:00.035890 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:00 crc kubenswrapper[4849]: E0320 13:27:00.035947 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:00 crc kubenswrapper[4849]: I0320 13:27:00.036496 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:00 crc kubenswrapper[4849]: E0320 13:27:00.036645 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.084213 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w7shr" podStartSLOduration=118.084196095 podStartE2EDuration="1m58.084196095s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.061606961 +0000 UTC m=+170.739330366" watchObservedRunningTime="2026-03-20 13:27:01.084196095 +0000 UTC m=+170.761919490" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.084542 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=62.084535935 podStartE2EDuration="1m2.084535935s" podCreationTimestamp="2026-03-20 13:25:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.08435188 +0000 UTC m=+170.762075275" watchObservedRunningTime="2026-03-20 13:27:01.084535935 +0000 UTC m=+170.762259330" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.099585 4849 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.09956823 podStartE2EDuration="1m21.09956823s" podCreationTimestamp="2026-03-20 13:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.09920594 +0000 UTC m=+170.776929345" watchObservedRunningTime="2026-03-20 13:27:01.09956823 +0000 UTC m=+170.777291625" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.111387 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.111368586 podStartE2EDuration="39.111368586s" podCreationTimestamp="2026-03-20 13:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.111283464 +0000 UTC m=+170.789006869" watchObservedRunningTime="2026-03-20 13:27:01.111368586 +0000 UTC m=+170.789091981" Mar 20 13:27:01 crc kubenswrapper[4849]: E0320 13:27:01.147620 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.161383 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7nxh7" podStartSLOduration=118.161364428 podStartE2EDuration="1m58.161364428s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.158224161 +0000 UTC m=+170.835947556" watchObservedRunningTime="2026-03-20 13:27:01.161364428 +0000 UTC m=+170.839087823" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.210768 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7cs2t" podStartSLOduration=118.210748463 podStartE2EDuration="1m58.210748463s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.192034856 +0000 UTC m=+170.869758261" watchObservedRunningTime="2026-03-20 13:27:01.210748463 +0000 UTC m=+170.888471858" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.210935 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=55.210930288 podStartE2EDuration="55.210930288s" podCreationTimestamp="2026-03-20 13:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.210338982 +0000 UTC m=+170.888062397" watchObservedRunningTime="2026-03-20 13:27:01.210930288 +0000 UTC m=+170.888653693" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.264535 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=13.264521359 
podStartE2EDuration="13.264521359s" podCreationTimestamp="2026-03-20 13:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.264375475 +0000 UTC m=+170.942098880" watchObservedRunningTime="2026-03-20 13:27:01.264521359 +0000 UTC m=+170.942244754" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.303961 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w65sz" podStartSLOduration=118.303927348 podStartE2EDuration="1m58.303927348s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.293557442 +0000 UTC m=+170.971280837" watchObservedRunningTime="2026-03-20 13:27:01.303927348 +0000 UTC m=+170.981650753" Mar 20 13:27:01 crc kubenswrapper[4849]: I0320 13:27:01.304173 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podStartSLOduration=118.304166585 podStartE2EDuration="1m58.304166585s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.303908458 +0000 UTC m=+170.981631883" watchObservedRunningTime="2026-03-20 13:27:01.304166585 +0000 UTC m=+170.981889990" Mar 20 13:27:02 crc kubenswrapper[4849]: I0320 13:27:02.034899 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:02 crc kubenswrapper[4849]: I0320 13:27:02.034933 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:02 crc kubenswrapper[4849]: E0320 13:27:02.035019 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:02 crc kubenswrapper[4849]: I0320 13:27:02.035035 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:02 crc kubenswrapper[4849]: I0320 13:27:02.035125 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:02 crc kubenswrapper[4849]: E0320 13:27:02.035234 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:02 crc kubenswrapper[4849]: E0320 13:27:02.035512 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:02 crc kubenswrapper[4849]: E0320 13:27:02.035644 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.128784 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.128860 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.128872 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.128895 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.128908 4849 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:27:03Z","lastTransitionTime":"2026-03-20T13:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.168106 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-g2gxw" podStartSLOduration=120.168091413 podStartE2EDuration="2m0.168091413s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:01.318973764 +0000 UTC m=+170.996697199" watchObservedRunningTime="2026-03-20 13:27:03.168091413 +0000 UTC m=+172.845814808" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.168427 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6"] Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.168739 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.170842 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.170939 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.171009 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.171172 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.237523 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9de466ec-fbd4-484a-98b3-1b492456e8d1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.237564 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9de466ec-fbd4-484a-98b3-1b492456e8d1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.237586 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9de466ec-fbd4-484a-98b3-1b492456e8d1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.237606 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de466ec-fbd4-484a-98b3-1b492456e8d1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.237628 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9de466ec-fbd4-484a-98b3-1b492456e8d1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.338295 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9de466ec-fbd4-484a-98b3-1b492456e8d1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.338583 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9de466ec-fbd4-484a-98b3-1b492456e8d1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.338610 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9de466ec-fbd4-484a-98b3-1b492456e8d1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.338635 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de466ec-fbd4-484a-98b3-1b492456e8d1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.338665 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/9de466ec-fbd4-484a-98b3-1b492456e8d1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.338664 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9de466ec-fbd4-484a-98b3-1b492456e8d1-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.338876 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9de466ec-fbd4-484a-98b3-1b492456e8d1-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.340028 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9de466ec-fbd4-484a-98b3-1b492456e8d1-service-ca\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.344348 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9de466ec-fbd4-484a-98b3-1b492456e8d1-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 
13:27:03.357892 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9de466ec-fbd4-484a-98b3-1b492456e8d1-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-kvzz6\" (UID: \"9de466ec-fbd4-484a-98b3-1b492456e8d1\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.485766 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.639944 4849 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.649593 4849 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.899837 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" event={"ID":"9de466ec-fbd4-484a-98b3-1b492456e8d1","Type":"ContainerStarted","Data":"1ca66004ad76329d6e24ed2a62684984d221dfbae388081640357fd117745b89"} Mar 20 13:27:03 crc kubenswrapper[4849]: I0320 13:27:03.899903 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" event={"ID":"9de466ec-fbd4-484a-98b3-1b492456e8d1","Type":"ContainerStarted","Data":"20a1eb5cd2eba27e17b57ce0122b060f244ade81810927f02d6e9e9f061e63c5"} Mar 20 13:27:04 crc kubenswrapper[4849]: I0320 13:27:04.034938 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:04 crc kubenswrapper[4849]: I0320 13:27:04.035022 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:04 crc kubenswrapper[4849]: E0320 13:27:04.035062 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:04 crc kubenswrapper[4849]: I0320 13:27:04.035099 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:04 crc kubenswrapper[4849]: I0320 13:27:04.035150 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:04 crc kubenswrapper[4849]: E0320 13:27:04.035274 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:04 crc kubenswrapper[4849]: E0320 13:27:04.035362 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:04 crc kubenswrapper[4849]: E0320 13:27:04.035401 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:05 crc kubenswrapper[4849]: I0320 13:27:05.036132 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f" Mar 20 13:27:05 crc kubenswrapper[4849]: E0320 13:27:05.036336 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:27:06 crc kubenswrapper[4849]: I0320 13:27:06.035336 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:06 crc kubenswrapper[4849]: I0320 13:27:06.035420 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:06 crc kubenswrapper[4849]: I0320 13:27:06.035372 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:06 crc kubenswrapper[4849]: E0320 13:27:06.035625 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:06 crc kubenswrapper[4849]: E0320 13:27:06.035690 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:06 crc kubenswrapper[4849]: E0320 13:27:06.035718 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:06 crc kubenswrapper[4849]: I0320 13:27:06.035929 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:06 crc kubenswrapper[4849]: E0320 13:27:06.036132 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:06 crc kubenswrapper[4849]: E0320 13:27:06.148568 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:27:08 crc kubenswrapper[4849]: I0320 13:27:08.035616 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:08 crc kubenswrapper[4849]: I0320 13:27:08.035659 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:08 crc kubenswrapper[4849]: I0320 13:27:08.035714 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:08 crc kubenswrapper[4849]: I0320 13:27:08.036412 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:08 crc kubenswrapper[4849]: E0320 13:27:08.036596 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:08 crc kubenswrapper[4849]: E0320 13:27:08.036709 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:08 crc kubenswrapper[4849]: E0320 13:27:08.036795 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:08 crc kubenswrapper[4849]: E0320 13:27:08.036939 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:10 crc kubenswrapper[4849]: I0320 13:27:10.035784 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:10 crc kubenswrapper[4849]: I0320 13:27:10.035806 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:10 crc kubenswrapper[4849]: E0320 13:27:10.036038 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:10 crc kubenswrapper[4849]: I0320 13:27:10.035859 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:10 crc kubenswrapper[4849]: E0320 13:27:10.036257 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:10 crc kubenswrapper[4849]: I0320 13:27:10.036324 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:10 crc kubenswrapper[4849]: E0320 13:27:10.036387 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:10 crc kubenswrapper[4849]: E0320 13:27:10.036421 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:11 crc kubenswrapper[4849]: E0320 13:27:11.149502 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.034802 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.034835 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.034851 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.034876 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:12 crc kubenswrapper[4849]: E0320 13:27:12.035357 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:12 crc kubenswrapper[4849]: E0320 13:27:12.035698 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:12 crc kubenswrapper[4849]: E0320 13:27:12.035534 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:12 crc kubenswrapper[4849]: E0320 13:27:12.035623 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.933679 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/1.log" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.934300 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/0.log" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.934339 4849 generic.go:334] "Generic (PLEG): container finished" podID="606dc5eb-f89f-41cb-8aa2-f55fcab8f04d" containerID="d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195" exitCode=1 Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.934395 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7nxh7" event={"ID":"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d","Type":"ContainerDied","Data":"d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195"} Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.934480 4849 scope.go:117] "RemoveContainer" containerID="26930ee6e262e4117db04409f60330941750febed2becf1930678c5ab72bebdd" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.935047 4849 scope.go:117] "RemoveContainer" containerID="d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195" Mar 20 13:27:12 crc kubenswrapper[4849]: E0320 13:27:12.935428 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7nxh7_openshift-multus(606dc5eb-f89f-41cb-8aa2-f55fcab8f04d)\"" pod="openshift-multus/multus-7nxh7" podUID="606dc5eb-f89f-41cb-8aa2-f55fcab8f04d" Mar 20 13:27:12 crc kubenswrapper[4849]: I0320 13:27:12.950766 4849 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-kvzz6" podStartSLOduration=129.950747477 podStartE2EDuration="2m9.950747477s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:03.913951527 +0000 UTC m=+173.591674922" watchObservedRunningTime="2026-03-20 13:27:12.950747477 +0000 UTC m=+182.628470872" Mar 20 13:27:13 crc kubenswrapper[4849]: I0320 13:27:13.938302 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/1.log" Mar 20 13:27:14 crc kubenswrapper[4849]: I0320 13:27:14.034989 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:14 crc kubenswrapper[4849]: I0320 13:27:14.034989 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:14 crc kubenswrapper[4849]: E0320 13:27:14.035122 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:14 crc kubenswrapper[4849]: I0320 13:27:14.035009 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:14 crc kubenswrapper[4849]: E0320 13:27:14.035200 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:14 crc kubenswrapper[4849]: E0320 13:27:14.035287 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:14 crc kubenswrapper[4849]: I0320 13:27:14.035742 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:14 crc kubenswrapper[4849]: E0320 13:27:14.036096 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:16 crc kubenswrapper[4849]: I0320 13:27:16.035873 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:16 crc kubenswrapper[4849]: I0320 13:27:16.035972 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:16 crc kubenswrapper[4849]: E0320 13:27:16.036066 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:16 crc kubenswrapper[4849]: I0320 13:27:16.036286 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:16 crc kubenswrapper[4849]: E0320 13:27:16.036804 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:16 crc kubenswrapper[4849]: E0320 13:27:16.037102 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:16 crc kubenswrapper[4849]: I0320 13:27:16.037175 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:16 crc kubenswrapper[4849]: E0320 13:27:16.037247 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:16 crc kubenswrapper[4849]: I0320 13:27:16.037682 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f" Mar 20 13:27:16 crc kubenswrapper[4849]: E0320 13:27:16.038113 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7z7ql_openshift-ovn-kubernetes(0ba9a25c-6156-4c78-a394-60507829eced)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" Mar 20 13:27:16 crc kubenswrapper[4849]: E0320 13:27:16.152245 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:27:18 crc kubenswrapper[4849]: I0320 13:27:18.035006 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:18 crc kubenswrapper[4849]: I0320 13:27:18.035059 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:18 crc kubenswrapper[4849]: E0320 13:27:18.035147 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:18 crc kubenswrapper[4849]: I0320 13:27:18.035025 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:18 crc kubenswrapper[4849]: E0320 13:27:18.035313 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:18 crc kubenswrapper[4849]: E0320 13:27:18.035544 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:18 crc kubenswrapper[4849]: I0320 13:27:18.035624 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:18 crc kubenswrapper[4849]: E0320 13:27:18.035701 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:20 crc kubenswrapper[4849]: I0320 13:27:20.035196 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:20 crc kubenswrapper[4849]: I0320 13:27:20.035246 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:20 crc kubenswrapper[4849]: I0320 13:27:20.035272 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:20 crc kubenswrapper[4849]: E0320 13:27:20.036189 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:20 crc kubenswrapper[4849]: E0320 13:27:20.036022 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:20 crc kubenswrapper[4849]: E0320 13:27:20.036237 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:20 crc kubenswrapper[4849]: I0320 13:27:20.035274 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:20 crc kubenswrapper[4849]: E0320 13:27:20.036326 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:21 crc kubenswrapper[4849]: E0320 13:27:21.153332 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:27:22 crc kubenswrapper[4849]: I0320 13:27:22.035042 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:22 crc kubenswrapper[4849]: I0320 13:27:22.035591 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:22 crc kubenswrapper[4849]: I0320 13:27:22.035936 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:22 crc kubenswrapper[4849]: E0320 13:27:22.035915 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:22 crc kubenswrapper[4849]: I0320 13:27:22.035973 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:22 crc kubenswrapper[4849]: E0320 13:27:22.036065 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:22 crc kubenswrapper[4849]: E0320 13:27:22.036572 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:22 crc kubenswrapper[4849]: E0320 13:27:22.036691 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:24 crc kubenswrapper[4849]: I0320 13:27:24.035019 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:24 crc kubenswrapper[4849]: I0320 13:27:24.035117 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:24 crc kubenswrapper[4849]: E0320 13:27:24.035173 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:24 crc kubenswrapper[4849]: I0320 13:27:24.035212 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:24 crc kubenswrapper[4849]: E0320 13:27:24.035359 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:24 crc kubenswrapper[4849]: E0320 13:27:24.035416 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:24 crc kubenswrapper[4849]: I0320 13:27:24.035574 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:24 crc kubenswrapper[4849]: E0320 13:27:24.035631 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:26 crc kubenswrapper[4849]: I0320 13:27:26.035439 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:26 crc kubenswrapper[4849]: I0320 13:27:26.035439 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:26 crc kubenswrapper[4849]: I0320 13:27:26.035439 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:26 crc kubenswrapper[4849]: E0320 13:27:26.036385 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:26 crc kubenswrapper[4849]: E0320 13:27:26.036424 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:26 crc kubenswrapper[4849]: E0320 13:27:26.036201 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:26 crc kubenswrapper[4849]: I0320 13:27:26.035476 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:26 crc kubenswrapper[4849]: E0320 13:27:26.036563 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:26 crc kubenswrapper[4849]: E0320 13:27:26.155147 4849 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:27:27 crc kubenswrapper[4849]: I0320 13:27:27.036047 4849 scope.go:117] "RemoveContainer" containerID="d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195" Mar 20 13:27:27 crc kubenswrapper[4849]: I0320 13:27:27.981666 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/1.log" Mar 20 13:27:27 crc kubenswrapper[4849]: I0320 13:27:27.981947 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7nxh7" event={"ID":"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d","Type":"ContainerStarted","Data":"69558596bddd811517bc5bd607ff8fa66fc36eff63c8acc05cc3b9bc094b4472"} Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.035351 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.035415 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.035427 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.035427 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:28 crc kubenswrapper[4849]: E0320 13:27:28.035517 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:28 crc kubenswrapper[4849]: E0320 13:27:28.035728 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:28 crc kubenswrapper[4849]: E0320 13:27:28.035762 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:28 crc kubenswrapper[4849]: E0320 13:27:28.035865 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.037632 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f" Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.883986 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vm768"] Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.986762 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/3.log" Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.989136 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerStarted","Data":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} Mar 20 13:27:28 crc kubenswrapper[4849]: I0320 13:27:28.989149 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:28 crc kubenswrapper[4849]: E0320 13:27:28.989363 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:30 crc kubenswrapper[4849]: I0320 13:27:30.035683 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:30 crc kubenswrapper[4849]: I0320 13:27:30.035745 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:30 crc kubenswrapper[4849]: E0320 13:27:30.037033 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:27:30 crc kubenswrapper[4849]: E0320 13:27:30.037188 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:27:30 crc kubenswrapper[4849]: I0320 13:27:30.035745 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:30 crc kubenswrapper[4849]: E0320 13:27:30.037667 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:27:31 crc kubenswrapper[4849]: I0320 13:27:31.035572 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:31 crc kubenswrapper[4849]: E0320 13:27:31.036740 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vm768" podUID="8ca35818-87a2-4dac-ad57-310ffe701961" Mar 20 13:27:32 crc kubenswrapper[4849]: I0320 13:27:32.035343 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:32 crc kubenswrapper[4849]: I0320 13:27:32.035356 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:32 crc kubenswrapper[4849]: I0320 13:27:32.036056 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:32 crc kubenswrapper[4849]: I0320 13:27:32.038838 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:27:32 crc kubenswrapper[4849]: I0320 13:27:32.039137 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:27:32 crc kubenswrapper[4849]: I0320 13:27:32.039480 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:27:32 crc kubenswrapper[4849]: I0320 13:27:32.039525 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:27:33 crc kubenswrapper[4849]: I0320 13:27:33.036005 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:33 crc kubenswrapper[4849]: I0320 13:27:33.038673 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:27:33 crc kubenswrapper[4849]: I0320 13:27:33.038715 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.075296 4849 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.107204 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podStartSLOduration=151.107183728 podStartE2EDuration="2m31.107183728s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
13:27:29.019042503 +0000 UTC m=+198.696765918" watchObservedRunningTime="2026-03-20 13:27:34.107183728 +0000 UTC m=+203.784907123" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.108323 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g4582"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.109210 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.109296 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.109748 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.110494 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.110907 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.115070 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.119293 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.119509 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.132092 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ztzl5"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.132862 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.134155 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.135138 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.135384 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.135713 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.135943 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.139385 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.139912 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.140235 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.140466 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.140630 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.140949 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.141635 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.141926 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.142089 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.142170 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.142260 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.142453 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.142580 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.143427 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146142 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146290 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146461 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146724 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146857 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.147040 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.147206 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146729 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146858 4849 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146898 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146982 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.146474 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.150050 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.150212 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.150539 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-chdbf"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.150948 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.151286 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.153810 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d5hbn"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.154519 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.155843 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkhpg\" (UniqueName: \"kubernetes.io/projected/060c5f02-9012-48d7-9f95-3677026da844-kube-api-access-kkhpg\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.155891 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-serving-cert\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.155922 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-audit\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.155963 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-oauth-config\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.155991 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-audit-policies\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156022 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd2a2c36-750c-426a-acfa-7359c0719805-config\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156051 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-etcd-client\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156076 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3071ec75-8957-46ad-8604-eaccf482cf02-audit-dir\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156099 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbcd2\" (UniqueName: \"kubernetes.io/projected/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-kube-api-access-nbcd2\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156125 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/acf1aff9-e595-444a-965c-a95c02348257-audit-dir\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156147 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-etcd-serving-ca\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156171 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753e3beb-9e10-4739-ad79-6ac49313ca7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156195 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-oauth-serving-cert\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156219 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-trusted-ca-bundle\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " 
pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156242 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156263 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98slx\" (UniqueName: \"kubernetes.io/projected/acf1aff9-e595-444a-965c-a95c02348257-kube-api-access-98slx\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156310 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156332 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b51068-69a5-456b-8594-202190bd605e-serving-cert\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156355 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-service-ca-bundle\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156377 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gskcg\" (UniqueName: \"kubernetes.io/projected/cd2a2c36-750c-426a-acfa-7359c0719805-kube-api-access-gskcg\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156401 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-config\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156426 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-image-import-ca\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156461 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-serving-cert\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 
13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156483 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8lpt\" (UniqueName: \"kubernetes.io/projected/3071ec75-8957-46ad-8604-eaccf482cf02-kube-api-access-m8lpt\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156504 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd2a2c36-750c-426a-acfa-7359c0719805-auth-proxy-config\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156526 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-config\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156548 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-config\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156570 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/acf1aff9-e595-444a-965c-a95c02348257-node-pullsecrets\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156594 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156616 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cd2a2c36-750c-426a-acfa-7359c0719805-machine-approver-tls\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156638 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-serving-cert\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156660 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpx9h\" (UniqueName: \"kubernetes.io/projected/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-kube-api-access-hpx9h\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc 
kubenswrapper[4849]: I0320 13:27:34.156698 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/060c5f02-9012-48d7-9f95-3677026da844-serving-cert\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156732 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156759 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-client-ca\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156780 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-encryption-config\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156801 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-config\") pod 
\"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156846 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpp55\" (UniqueName: \"kubernetes.io/projected/d7b51068-69a5-456b-8594-202190bd605e-kube-api-access-jpp55\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156869 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-etcd-client\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156890 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-client-ca\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156912 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156937 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156962 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-encryption-config\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156986 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-config\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.159767 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156591 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.160446 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.160866 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-service-ca\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.160910 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frd9f\" (UniqueName: \"kubernetes.io/projected/753e3beb-9e10-4739-ad79-6ac49313ca7b-kube-api-access-frd9f\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.160930 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.162536 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.162754 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tb9sj"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.163308 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.156741 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.157065 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.158243 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.163681 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vrpx2"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.158960 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.164028 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.164764 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.164776 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.165313 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.165728 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.166178 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.172736 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.173157 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.173203 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.174210 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h75cn"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.174341 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.175041 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.175111 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.175487 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.178178 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.178441 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.178523 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.178593 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.178732 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.178847 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.178921 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179018 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179057 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179138 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179220 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179139 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179308 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-568bd"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.180002 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179317 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179231 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179354 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179388 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179416 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179434 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" 
Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179447 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179467 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179480 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179496 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179511 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179540 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179571 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179605 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179615 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179659 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.179695 4849 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.180007 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.181263 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mwgqm"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.200026 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.203573 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfr7p"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.216467 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.228044 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.228066 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.228182 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229112 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229199 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 
13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229456 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229705 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229789 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229888 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229709 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.230056 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.230122 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229802 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.229733 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.230030 4849 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.230965 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k4s2s"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.231073 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.232119 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.232195 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.232313 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.232689 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.233428 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.237576 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.238125 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.238214 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 
13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.238616 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.238760 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.238942 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.239082 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.239252 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.239470 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.242499 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.242730 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9lhk8"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.243470 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.243622 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.243529 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ttnt5"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.244934 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.245026 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.245638 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.246139 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.251903 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.252098 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.253931 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.254405 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.254510 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.254563 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.254789 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.254937 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.256523 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.257061 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.262880 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753e3beb-9e10-4739-ad79-6ac49313ca7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.262934 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-etcd-serving-ca\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263001 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263041 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4ce29d69-5989-485a-9da1-3db91f1030fd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263070 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p5jr\" (UniqueName: \"kubernetes.io/projected/08e134c2-39cf-49e0-949e-b43e2de6eda3-kube-api-access-9p5jr\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263097 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-oauth-serving-cert\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263122 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db727f58-5ed2-4e4f-88d1-5df962353c84-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263144 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2m6s\" (UniqueName: \"kubernetes.io/projected/e0a6353b-f7df-4ef2-b5c0-e52f35646aba-kube-api-access-m2m6s\") pod \"downloads-7954f5f757-mwgqm\" (UID: \"e0a6353b-f7df-4ef2-b5c0-e52f35646aba\") " pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263175 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-trusted-ca-bundle\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " 
pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263199 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263223 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-trusted-ca\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263251 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98slx\" (UniqueName: \"kubernetes.io/projected/acf1aff9-e595-444a-965c-a95c02348257-kube-api-access-98slx\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263276 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263298 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d7b51068-69a5-456b-8594-202190bd605e-serving-cert\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263321 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-service-ca-bundle\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263352 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263378 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d0b2328a-5711-41d8-b908-79389f55898e-etcd-client\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263401 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7f3b98b0-5152-4678-bf63-a92ab6759fd4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263431 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gskcg\" (UniqueName: \"kubernetes.io/projected/cd2a2c36-750c-426a-acfa-7359c0719805-kube-api-access-gskcg\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263459 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bljm6\" (UniqueName: \"kubernetes.io/projected/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-kube-api-access-bljm6\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263488 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb92d3ea-ece6-4afc-ac78-3a35f9635095-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263532 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-serving-cert\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263561 4849 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-m8lpt\" (UniqueName: \"kubernetes.io/projected/3071ec75-8957-46ad-8604-eaccf482cf02-kube-api-access-m8lpt\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263588 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-config\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263611 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-image-import-ca\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263648 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263675 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b0d7c79-e28e-436d-be90-8633bef20e8f-proxy-tls\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" 
Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263716 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263741 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db727f58-5ed2-4e4f-88d1-5df962353c84-config\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263769 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd2a2c36-750c-426a-acfa-7359c0719805-auth-proxy-config\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263794 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-config\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263837 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263870 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-config\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263896 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/acf1aff9-e595-444a-965c-a95c02348257-node-pullsecrets\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263921 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263947 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cd2a2c36-750c-426a-acfa-7359c0719805-machine-approver-tls\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.263975 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tljv\" (UniqueName: \"kubernetes.io/projected/62c47fea-7805-489a-addc-f3e37eb30b7e-kube-api-access-2tljv\") pod \"migrator-59844c95c7-5j2ck\" (UID: \"62c47fea-7805-489a-addc-f3e37eb30b7e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.264004 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-serving-cert\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.264030 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e921bc-e295-4ace-a807-8768ed476321-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.264172 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-etcd-serving-ca\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.264726 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-service-ca-bundle\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.264809 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-trusted-ca-bundle\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.265150 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-oauth-serving-cert\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.265450 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd2a2c36-750c-426a-acfa-7359c0719805-auth-proxy-config\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.265455 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-n5ktf"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.265900 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fdv6\" (UniqueName: \"kubernetes.io/projected/69e921bc-e295-4ace-a807-8768ed476321-kube-api-access-4fdv6\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 
13:27:34.265945 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-config\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.265951 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.265991 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb92d3ea-ece6-4afc-ac78-3a35f9635095-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266013 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08e134c2-39cf-49e0-949e-b43e2de6eda3-config\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266098 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-config\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: 
\"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266395 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpx9h\" (UniqueName: \"kubernetes.io/projected/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-kube-api-access-hpx9h\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266419 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e5f2143-15f8-487f-96ab-0242615ed791-metrics-tls\") pod \"dns-operator-744455d44c-9lhk8\" (UID: \"9e5f2143-15f8-487f-96ab-0242615ed791\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266442 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bk5v\" (UniqueName: \"kubernetes.io/projected/7ec0bd3b-8a73-446d-8fb6-53c537db79f0-kube-api-access-4bk5v\") pod \"control-plane-machine-set-operator-78cbb6b69f-hv7bs\" (UID: \"7ec0bd3b-8a73-446d-8fb6-53c537db79f0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266479 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266514 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/060c5f02-9012-48d7-9f95-3677026da844-serving-cert\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266531 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266596 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266611 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/acf1aff9-e595-444a-965c-a95c02348257-node-pullsecrets\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.266646 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hfct\" (UniqueName: \"kubernetes.io/projected/9e5f2143-15f8-487f-96ab-0242615ed791-kube-api-access-5hfct\") pod \"dns-operator-744455d44c-9lhk8\" (UID: \"9e5f2143-15f8-487f-96ab-0242615ed791\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.271274 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srfb\" (UniqueName: \"kubernetes.io/projected/f4f2310e-2f2f-4d0b-97a7-3b740a881646-kube-api-access-2srfb\") pod \"cluster-samples-operator-665b6dd947-qxbwf\" (UID: \"f4f2310e-2f2f-4d0b-97a7-3b740a881646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.271354 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb92d3ea-ece6-4afc-ac78-3a35f9635095-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.271378 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z575w\" (UniqueName: \"kubernetes.io/projected/4ce29d69-5989-485a-9da1-3db91f1030fd-kube-api-access-z575w\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.272170 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-config\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.272417 4849 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-image-import-ca\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.276420 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-serving-cert\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.276615 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753e3beb-9e10-4739-ad79-6ac49313ca7b-serving-cert\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.277207 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566886-7cjjt"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.282689 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7b51068-69a5-456b-8594-202190bd605e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.283091 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-serving-cert\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: 
\"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.283347 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.284917 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285168 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-client-ca\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285318 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-encryption-config\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285352 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-config\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285405 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7b51068-69a5-456b-8594-202190bd605e-serving-cert\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285486 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285516 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285655 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-config\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285927 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpp55\" (UniqueName: \"kubernetes.io/projected/d7b51068-69a5-456b-8594-202190bd605e-kube-api-access-jpp55\") pod 
\"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.285964 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-config\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.286105 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-metrics-tls\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.286139 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpp46\" (UniqueName: \"kubernetes.io/projected/7b0d7c79-e28e-436d-be90-8633bef20e8f-kube-api-access-zpp46\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.286276 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-etcd-client\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.286303 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b0d7c79-e28e-436d-be90-8633bef20e8f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.286431 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-policies\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.286466 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-client-ca\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.286595 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.286626 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db727f58-5ed2-4e4f-88d1-5df962353c84-images\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.287486 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.287527 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-encryption-config\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.287659 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklw4\" (UniqueName: \"kubernetes.io/projected/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-kube-api-access-qklw4\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.287685 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-dir\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.287840 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.287869 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0b2328a-5711-41d8-b908-79389f55898e-serving-cert\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.287895 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqd2k\" (UniqueName: \"kubernetes.io/projected/7f3b98b0-5152-4678-bf63-a92ab6759fd4-kube-api-access-dqd2k\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288026 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288053 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slmft\" (UniqueName: \"kubernetes.io/projected/bb92d3ea-ece6-4afc-ac78-3a35f9635095-kube-api-access-slmft\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: 
\"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288188 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-config\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288282 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-client-ca\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288315 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-service-ca\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288360 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-bound-sa-token\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288508 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frd9f\" (UniqueName: 
\"kubernetes.io/projected/753e3beb-9e10-4739-ad79-6ac49313ca7b-kube-api-access-frd9f\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288535 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288668 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f3b98b0-5152-4678-bf63-a92ab6759fd4-serving-cert\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288698 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ce29d69-5989-485a-9da1-3db91f1030fd-proxy-tls\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.288887 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xfgn4"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.289018 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/08e134c2-39cf-49e0-949e-b43e2de6eda3-serving-cert\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.289058 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkhpg\" (UniqueName: \"kubernetes.io/projected/060c5f02-9012-48d7-9f95-3677026da844-kube-api-access-kkhpg\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.289196 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e921bc-e295-4ace-a807-8768ed476321-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.289222 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b0d7c79-e28e-436d-be90-8633bef20e8f-images\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.289792 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-encryption-config\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc 
kubenswrapper[4849]: I0320 13:27:34.289928 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-serving-cert\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.289968 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.290222 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.290277 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-oauth-config\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.290319 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-audit\") pod \"apiserver-76f77b778f-g4582\" 
(UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.290360 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec0bd3b-8a73-446d-8fb6-53c537db79f0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hv7bs\" (UID: \"7ec0bd3b-8a73-446d-8fb6-53c537db79f0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.290403 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfxvp\" (UniqueName: \"kubernetes.io/projected/db727f58-5ed2-4e4f-88d1-5df962353c84-kube-api-access-vfxvp\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.290432 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08e134c2-39cf-49e0-949e-b43e2de6eda3-trusted-ca\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.291022 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.291146 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.296768 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.317479 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.354261 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-trusted-ca-bundle\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.356144 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: 
\"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358620 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-etcd-client\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358697 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-audit-policies\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358725 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358769 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd2a2c36-750c-426a-acfa-7359c0719805-config\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358789 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358807 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358843 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22n82\" (UniqueName: \"kubernetes.io/projected/d0b2328a-5711-41d8-b908-79389f55898e-kube-api-access-22n82\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358873 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-etcd-client\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358891 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3071ec75-8957-46ad-8604-eaccf482cf02-audit-dir\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358907 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbcd2\" (UniqueName: \"kubernetes.io/projected/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-kube-api-access-nbcd2\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358924 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccnh5\" (UniqueName: \"kubernetes.io/projected/f1f2af94-ce72-498b-a231-d171ab0e8760-kube-api-access-ccnh5\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358942 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-etcd-ca\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358965 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/acf1aff9-e595-444a-965c-a95c02348257-audit-dir\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358982 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4f2310e-2f2f-4d0b-97a7-3b740a881646-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qxbwf\" (UID: \"f4f2310e-2f2f-4d0b-97a7-3b740a881646\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.358999 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.359015 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-etcd-service-ca\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.359303 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-service-ca\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.360131 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3071ec75-8957-46ad-8604-eaccf482cf02-audit-dir\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.360273 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/acf1aff9-e595-444a-965c-a95c02348257-audit-dir\") pod \"apiserver-76f77b778f-g4582\" 
(UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.360310 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.362099 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/cd2a2c36-750c-426a-acfa-7359c0719805-machine-approver-tls\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.362661 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-audit\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.364320 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-config\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.364979 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-oauth-config\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.368861 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acf1aff9-e595-444a-965c-a95c02348257-config\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.371225 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.371517 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.371515 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd2a2c36-750c-426a-acfa-7359c0719805-config\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.371675 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.371788 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.372000 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3071ec75-8957-46ad-8604-eaccf482cf02-audit-policies\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.372209 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.372590 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.372957 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.291887 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/060c5f02-9012-48d7-9f95-3677026da844-serving-cert\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.373156 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.373407 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.375196 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.377380 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.377876 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-serving-cert\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.379344 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3071ec75-8957-46ad-8604-eaccf482cf02-etcd-client\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.380116 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-client-ca\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.384188 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5747m"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.385006 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.385908 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/acf1aff9-e595-444a-965c-a95c02348257-encryption-config\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.387305 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.388008 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.388929 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.389551 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.392512 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.393102 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.395409 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.395994 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.397423 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.397479 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v8tw5"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.398150 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.401282 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.402559 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.402967 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r8cpv"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.403511 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r8cpv" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.404212 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-chdbf"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.405726 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.406948 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g4582"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.409092 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tb9sj"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.409811 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.411220 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ztzl5"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.411744 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.412663 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.414047 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.415527 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mwgqm"] Mar 20 13:27:34 
crc kubenswrapper[4849]: I0320 13:27:34.416802 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.418149 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.419512 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h75cn"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.420966 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9lhk8"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.422290 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vrpx2"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.423552 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.424838 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.426029 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.427282 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ttnt5"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.428728 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 
13:27:34.431594 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.432160 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.434039 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-7cjjt"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.435894 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xfgn4"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.437352 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.438773 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pmc7z"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.440066 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.440244 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.441406 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.452804 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.453187 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-568bd"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.455665 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.457917 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.459574 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k4s2s"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460512 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460576 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22n82\" (UniqueName: \"kubernetes.io/projected/d0b2328a-5711-41d8-b908-79389f55898e-kube-api-access-22n82\") pod \"etcd-operator-b45778765-k4s2s\" (UID: 
\"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460598 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460619 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460652 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccnh5\" (UniqueName: \"kubernetes.io/projected/f1f2af94-ce72-498b-a231-d171ab0e8760-kube-api-access-ccnh5\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460677 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-etcd-ca\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460703 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f4f2310e-2f2f-4d0b-97a7-3b740a881646-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qxbwf\" (UID: \"f4f2310e-2f2f-4d0b-97a7-3b740a881646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460728 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460747 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-etcd-service-ca\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460772 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460795 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4ce29d69-5989-485a-9da1-3db91f1030fd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460838 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p5jr\" (UniqueName: \"kubernetes.io/projected/08e134c2-39cf-49e0-949e-b43e2de6eda3-kube-api-access-9p5jr\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460860 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db727f58-5ed2-4e4f-88d1-5df962353c84-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460882 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2m6s\" (UniqueName: \"kubernetes.io/projected/e0a6353b-f7df-4ef2-b5c0-e52f35646aba-kube-api-access-m2m6s\") pod \"downloads-7954f5f757-mwgqm\" (UID: \"e0a6353b-f7df-4ef2-b5c0-e52f35646aba\") " pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460907 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-trusted-ca\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460924 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfr7p"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 
13:27:34.460935 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460964 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d0b2328a-5711-41d8-b908-79389f55898e-etcd-client\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.460994 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7f3b98b0-5152-4678-bf63-a92ab6759fd4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461054 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bljm6\" (UniqueName: \"kubernetes.io/projected/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-kube-api-access-bljm6\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461079 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb92d3ea-ece6-4afc-ac78-3a35f9635095-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461122 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461146 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db727f58-5ed2-4e4f-88d1-5df962353c84-config\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461173 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461199 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b0d7c79-e28e-436d-be90-8633bef20e8f-proxy-tls\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461223 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461251 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tljv\" (UniqueName: \"kubernetes.io/projected/62c47fea-7805-489a-addc-f3e37eb30b7e-kube-api-access-2tljv\") pod \"migrator-59844c95c7-5j2ck\" (UID: \"62c47fea-7805-489a-addc-f3e37eb30b7e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461274 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461294 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb92d3ea-ece6-4afc-ac78-3a35f9635095-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461317 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08e134c2-39cf-49e0-949e-b43e2de6eda3-config\") pod 
\"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461339 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e921bc-e295-4ace-a807-8768ed476321-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461363 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fdv6\" (UniqueName: \"kubernetes.io/projected/69e921bc-e295-4ace-a807-8768ed476321-kube-api-access-4fdv6\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461402 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e5f2143-15f8-487f-96ab-0242615ed791-metrics-tls\") pod \"dns-operator-744455d44c-9lhk8\" (UID: \"9e5f2143-15f8-487f-96ab-0242615ed791\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461427 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461450 
4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bk5v\" (UniqueName: \"kubernetes.io/projected/7ec0bd3b-8a73-446d-8fb6-53c537db79f0-kube-api-access-4bk5v\") pod \"control-plane-machine-set-operator-78cbb6b69f-hv7bs\" (UID: \"7ec0bd3b-8a73-446d-8fb6-53c537db79f0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461479 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461505 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hfct\" (UniqueName: \"kubernetes.io/projected/9e5f2143-15f8-487f-96ab-0242615ed791-kube-api-access-5hfct\") pod \"dns-operator-744455d44c-9lhk8\" (UID: \"9e5f2143-15f8-487f-96ab-0242615ed791\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461530 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2srfb\" (UniqueName: \"kubernetes.io/projected/f4f2310e-2f2f-4d0b-97a7-3b740a881646-kube-api-access-2srfb\") pod \"cluster-samples-operator-665b6dd947-qxbwf\" (UID: \"f4f2310e-2f2f-4d0b-97a7-3b740a881646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461556 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb92d3ea-ece6-4afc-ac78-3a35f9635095-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461581 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z575w\" (UniqueName: \"kubernetes.io/projected/4ce29d69-5989-485a-9da1-3db91f1030fd-kube-api-access-z575w\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461616 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461641 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461666 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-config\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 
13:27:34.461698 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-config\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461720 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-metrics-tls\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461740 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpp46\" (UniqueName: \"kubernetes.io/projected/7b0d7c79-e28e-436d-be90-8633bef20e8f-kube-api-access-zpp46\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461760 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b0d7c79-e28e-436d-be90-8633bef20e8f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461784 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-policies\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: 
\"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461806 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db727f58-5ed2-4e4f-88d1-5df962353c84-images\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461844 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qklw4\" (UniqueName: \"kubernetes.io/projected/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-kube-api-access-qklw4\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461866 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-dir\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461886 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461907 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d0b2328a-5711-41d8-b908-79389f55898e-serving-cert\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461929 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqd2k\" (UniqueName: \"kubernetes.io/projected/7f3b98b0-5152-4678-bf63-a92ab6759fd4-kube-api-access-dqd2k\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461952 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.461977 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slmft\" (UniqueName: \"kubernetes.io/projected/bb92d3ea-ece6-4afc-ac78-3a35f9635095-kube-api-access-slmft\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462000 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-bound-sa-token\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462021 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f3b98b0-5152-4678-bf63-a92ab6759fd4-serving-cert\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462042 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ce29d69-5989-485a-9da1-3db91f1030fd-proxy-tls\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462068 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08e134c2-39cf-49e0-949e-b43e2de6eda3-serving-cert\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462106 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e921bc-e295-4ace-a807-8768ed476321-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462128 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/7b0d7c79-e28e-436d-be90-8633bef20e8f-images\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462153 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462177 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462214 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec0bd3b-8a73-446d-8fb6-53c537db79f0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hv7bs\" (UID: \"7ec0bd3b-8a73-446d-8fb6-53c537db79f0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462235 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfxvp\" (UniqueName: \"kubernetes.io/projected/db727f58-5ed2-4e4f-88d1-5df962353c84-kube-api-access-vfxvp\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: 
\"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462255 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08e134c2-39cf-49e0-949e-b43e2de6eda3-trusted-ca\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.462334 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.463094 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bb92d3ea-ece6-4afc-ac78-3a35f9635095-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.463447 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-trusted-ca\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.463493 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/7f3b98b0-5152-4678-bf63-a92ab6759fd4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.463667 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.464045 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/08e134c2-39cf-49e0-949e-b43e2de6eda3-trusted-ca\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.464311 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-policies\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.464544 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4ce29d69-5989-485a-9da1-3db91f1030fd-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 
13:27:34.462032 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.464729 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d5hbn"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.464760 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-dir\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.464791 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db727f58-5ed2-4e4f-88d1-5df962353c84-config\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.464973 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7b0d7c79-e28e-436d-be90-8633bef20e8f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.465002 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/db727f58-5ed2-4e4f-88d1-5df962353c84-images\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: 
I0320 13:27:34.465349 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.465996 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e921bc-e295-4ace-a807-8768ed476321-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.466177 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08e134c2-39cf-49e0-949e-b43e2de6eda3-config\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.466291 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.467185 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.467569 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-r8cpv"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.467738 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08e134c2-39cf-49e0-949e-b43e2de6eda3-serving-cert\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.468005 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.468000 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4f2310e-2f2f-4d0b-97a7-3b740a881646-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-qxbwf\" (UID: \"f4f2310e-2f2f-4d0b-97a7-3b740a881646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.468166 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.468842 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-config\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.469154 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v8tw5"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.470050 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.470282 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/db727f58-5ed2-4e4f-88d1-5df962353c84-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.470635 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb92d3ea-ece6-4afc-ac78-3a35f9635095-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.470915 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.470957 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5747m"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.471246 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.471685 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.472767 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pmc7z"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.473088 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.473508 4849 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.474629 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7b6nh"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.476025 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-9zhrn"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.476213 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.476537 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.477373 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7b6nh"] Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.477981 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.478029 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e921bc-e295-4ace-a807-8768ed476321-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.478235 4849 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f3b98b0-5152-4678-bf63-a92ab6759fd4-serving-cert\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.478668 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-metrics-tls\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.480978 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.491882 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.511318 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.515087 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 
13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.532100 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.551268 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.572772 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.577888 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0b2328a-5711-41d8-b908-79389f55898e-serving-cert\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.593182 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.606381 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d0b2328a-5711-41d8-b908-79389f55898e-etcd-client\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.611649 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.619105 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-config\") pod \"etcd-operator-b45778765-k4s2s\" (UID: 
\"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.632018 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.642449 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-etcd-ca\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.652052 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.663410 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d0b2328a-5711-41d8-b908-79389f55898e-etcd-service-ca\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.671566 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.712003 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.732305 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.751465 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.772041 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.783274 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9e5f2143-15f8-487f-96ab-0242615ed791-metrics-tls\") pod \"dns-operator-744455d44c-9lhk8\" (UID: \"9e5f2143-15f8-487f-96ab-0242615ed791\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.791161 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.811427 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.831414 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.851885 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.871984 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.892036 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.911465 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.918790 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ce29d69-5989-485a-9da1-3db91f1030fd-proxy-tls\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.932466 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.951728 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.972394 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.982344 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ec0bd3b-8a73-446d-8fb6-53c537db79f0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hv7bs\" (UID: \"7ec0bd3b-8a73-446d-8fb6-53c537db79f0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.992238 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 13:27:34 crc kubenswrapper[4849]: I0320 13:27:34.996973 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b0d7c79-e28e-436d-be90-8633bef20e8f-proxy-tls\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.011012 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.015310 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7b0d7c79-e28e-436d-be90-8633bef20e8f-images\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.031289 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.052196 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.071609 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.079023 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.092209 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: 
I0320 13:27:35.110969 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.114676 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.151925 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.171245 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.192888 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.211520 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.245764 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gskcg\" (UniqueName: \"kubernetes.io/projected/cd2a2c36-750c-426a-acfa-7359c0719805-kube-api-access-gskcg\") pod \"machine-approver-56656f9798-hcx8v\" (UID: \"cd2a2c36-750c-426a-acfa-7359c0719805\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.262912 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m8lpt\" (UniqueName: \"kubernetes.io/projected/3071ec75-8957-46ad-8604-eaccf482cf02-kube-api-access-m8lpt\") pod \"apiserver-7bbb656c7d-5dq64\" (UID: \"3071ec75-8957-46ad-8604-eaccf482cf02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.284937 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98slx\" (UniqueName: \"kubernetes.io/projected/acf1aff9-e595-444a-965c-a95c02348257-kube-api-access-98slx\") pod \"apiserver-76f77b778f-g4582\" (UID: \"acf1aff9-e595-444a-965c-a95c02348257\") " pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.290021 4849 request.go:700] Waited for 1.00900476s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.309259 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpx9h\" (UniqueName: \"kubernetes.io/projected/303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6-kube-api-access-hpx9h\") pod \"openshift-apiserver-operator-796bbdcf4f-8tznr\" (UID: \"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.311135 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.332175 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.337174 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.352653 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.371520 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.375322 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.393791 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.411213 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.433161 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.445054 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.463586 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.466141 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpp55\" (UniqueName: \"kubernetes.io/projected/d7b51068-69a5-456b-8594-202190bd605e-kube-api-access-jpp55\") pod \"authentication-operator-69f744f599-chdbf\" (UID: \"d7b51068-69a5-456b-8594-202190bd605e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.496320 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkhpg\" (UniqueName: \"kubernetes.io/projected/060c5f02-9012-48d7-9f95-3677026da844-kube-api-access-kkhpg\") pod \"controller-manager-879f6c89f-d5hbn\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.509753 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frd9f\" (UniqueName: \"kubernetes.io/projected/753e3beb-9e10-4739-ad79-6ac49313ca7b-kube-api-access-frd9f\") pod \"route-controller-manager-6576b87f9c-lrbs8\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.531591 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.532071 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbcd2\" (UniqueName: \"kubernetes.io/projected/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-kube-api-access-nbcd2\") pod \"console-f9d7485db-ztzl5\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:35 
crc kubenswrapper[4849]: I0320 13:27:35.553641 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.571706 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.587251 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.592371 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.610900 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.621000 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-g4582"] Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.632082 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:27:35 crc kubenswrapper[4849]: W0320 13:27:35.641168 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacf1aff9_e595_444a_965c_a95c02348257.slice/crio-c2c6e8eefd0fda3cfcc07211d2be46be46d39e6275499b63919825db9fc1a3d6 WatchSource:0}: Error finding container c2c6e8eefd0fda3cfcc07211d2be46be46d39e6275499b63919825db9fc1a3d6: Status 404 returned error can't find the container with id c2c6e8eefd0fda3cfcc07211d2be46be46d39e6275499b63919825db9fc1a3d6 Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.646570 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64"] Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.650601 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 13:27:35 crc kubenswrapper[4849]: W0320 13:27:35.657854 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3071ec75_8957_46ad_8604_eaccf482cf02.slice/crio-d6e514daef923b2204a4e7b5e902efefd1a7800b46bdfa5349167e5c647e2667 WatchSource:0}: Error finding container d6e514daef923b2204a4e7b5e902efefd1a7800b46bdfa5349167e5c647e2667: Status 404 returned error can't find the container with id d6e514daef923b2204a4e7b5e902efefd1a7800b46bdfa5349167e5c647e2667 Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.659989 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.671399 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.692283 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.692646 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr"] Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.711429 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.731574 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.742987 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.753701 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.753780 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.766251 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d5hbn"] Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.771571 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:27:35 crc kubenswrapper[4849]: W0320 13:27:35.781111 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod060c5f02_9012_48d7_9f95_3677026da844.slice/crio-7955d9f9a67af7d4193d2de65862c7a4fd11a9919514595f7e7bed7d4a49bbb7 WatchSource:0}: Error finding container 7955d9f9a67af7d4193d2de65862c7a4fd11a9919514595f7e7bed7d4a49bbb7: Status 404 returned error can't find the container with id 7955d9f9a67af7d4193d2de65862c7a4fd11a9919514595f7e7bed7d4a49bbb7 Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.792461 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.815286 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.831526 4849 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.837072 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8"] Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.858040 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.875585 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.891970 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.911205 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.931376 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.952276 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.972621 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.991474 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:27:35 crc kubenswrapper[4849]: I0320 13:27:35.997068 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-chdbf"] Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.009354 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ztzl5"] Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.010772 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.013548 4849 generic.go:334] "Generic (PLEG): container finished" podID="acf1aff9-e595-444a-965c-a95c02348257" containerID="caf881ef68645345093735c085cf5e7e3506c370e3a3df1075271cf4d16db68e" exitCode=0 Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.013585 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g4582" event={"ID":"acf1aff9-e595-444a-965c-a95c02348257","Type":"ContainerDied","Data":"caf881ef68645345093735c085cf5e7e3506c370e3a3df1075271cf4d16db68e"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.013618 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g4582" event={"ID":"acf1aff9-e595-444a-965c-a95c02348257","Type":"ContainerStarted","Data":"c2c6e8eefd0fda3cfcc07211d2be46be46d39e6275499b63919825db9fc1a3d6"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.015326 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" event={"ID":"060c5f02-9012-48d7-9f95-3677026da844","Type":"ContainerStarted","Data":"da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.015367 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" 
event={"ID":"060c5f02-9012-48d7-9f95-3677026da844","Type":"ContainerStarted","Data":"7955d9f9a67af7d4193d2de65862c7a4fd11a9919514595f7e7bed7d4a49bbb7"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.015563 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.016842 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" event={"ID":"cd2a2c36-750c-426a-acfa-7359c0719805","Type":"ContainerStarted","Data":"ec2807312dade83cb808bba48f2a78e2275b841e0bec6e4c9bf0e13bd5e07fb7"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.016870 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" event={"ID":"cd2a2c36-750c-426a-acfa-7359c0719805","Type":"ContainerStarted","Data":"c605376eef69c71e61e36340c7ddc3c4a189ec1fd2fb700665ad647aa764bac9"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.017915 4849 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-d5hbn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.017954 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" podUID="060c5f02-9012-48d7-9f95-3677026da844" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.018252 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" event={"ID":"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6","Type":"ContainerStarted","Data":"c95ce57eeb6798f0374bffe5e1fcb11d819377ff6ba37dd1defed2372b46c5a8"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.018281 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" event={"ID":"303f3e4d-a8b3-4c0a-a3dc-f3b80bfef8f6","Type":"ContainerStarted","Data":"7c21b77f6512a2dacb65ffac0ddacc05f36e2b7257469226542155ed73386b1b"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.019763 4849 generic.go:334] "Generic (PLEG): container finished" podID="3071ec75-8957-46ad-8604-eaccf482cf02" containerID="b3ebba4c3f3fc00fcc0302b7ed6ed47cc4346efb993f4542d7f31b58ea44dbb6" exitCode=0 Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.019855 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" event={"ID":"3071ec75-8957-46ad-8604-eaccf482cf02","Type":"ContainerDied","Data":"b3ebba4c3f3fc00fcc0302b7ed6ed47cc4346efb993f4542d7f31b58ea44dbb6"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.019878 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" event={"ID":"3071ec75-8957-46ad-8604-eaccf482cf02","Type":"ContainerStarted","Data":"d6e514daef923b2204a4e7b5e902efefd1a7800b46bdfa5349167e5c647e2667"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.021689 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" event={"ID":"753e3beb-9e10-4739-ad79-6ac49313ca7b","Type":"ContainerStarted","Data":"24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.021713 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" event={"ID":"753e3beb-9e10-4739-ad79-6ac49313ca7b","Type":"ContainerStarted","Data":"8c0af0a95631cb2f64fcaca2ee7cc8a8411e95513cf8e91d125ace49a605df0d"} Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.021845 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.022941 4849 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lrbs8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.022980 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" podUID="753e3beb-9e10-4739-ad79-6ac49313ca7b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.035865 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.051713 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:27:36 crc kubenswrapper[4849]: W0320 13:27:36.068616 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7b51068_69a5_456b_8594_202190bd605e.slice/crio-d494082da0c76e680e72dba345d451229b975f4ad72c06da65f7b8fefc411606 WatchSource:0}: Error finding container 
d494082da0c76e680e72dba345d451229b975f4ad72c06da65f7b8fefc411606: Status 404 returned error can't find the container with id d494082da0c76e680e72dba345d451229b975f4ad72c06da65f7b8fefc411606 Mar 20 13:27:36 crc kubenswrapper[4849]: W0320 13:27:36.069939 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod200191b3_9ea4_4ed7_b4b1_05e8ce9d3537.slice/crio-5da13af49286494d17e3f05668954aa773d596e0dc25776a32d0f00c8ad64a9f WatchSource:0}: Error finding container 5da13af49286494d17e3f05668954aa773d596e0dc25776a32d0f00c8ad64a9f: Status 404 returned error can't find the container with id 5da13af49286494d17e3f05668954aa773d596e0dc25776a32d0f00c8ad64a9f Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.071663 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.091467 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.113076 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.131266 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.155071 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.171996 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.223547 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 
13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.224046 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.232141 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.278485 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2m6s\" (UniqueName: \"kubernetes.io/projected/e0a6353b-f7df-4ef2-b5c0-e52f35646aba-kube-api-access-m2m6s\") pod \"downloads-7954f5f757-mwgqm\" (UID: \"e0a6353b-f7df-4ef2-b5c0-e52f35646aba\") " pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.290043 4849 request.go:700] Waited for 1.828431724s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/serviceaccounts/console-operator/token Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.291347 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccnh5\" (UniqueName: \"kubernetes.io/projected/f1f2af94-ce72-498b-a231-d171ab0e8760-kube-api-access-ccnh5\") pod \"oauth-openshift-558db77b4-lfr7p\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.310774 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p5jr\" (UniqueName: \"kubernetes.io/projected/08e134c2-39cf-49e0-949e-b43e2de6eda3-kube-api-access-9p5jr\") pod \"console-operator-58897d9998-vrpx2\" (UID: \"08e134c2-39cf-49e0-949e-b43e2de6eda3\") " pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.330396 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-22n82\" (UniqueName: \"kubernetes.io/projected/d0b2328a-5711-41d8-b908-79389f55898e-kube-api-access-22n82\") pod \"etcd-operator-b45778765-k4s2s\" (UID: \"d0b2328a-5711-41d8-b908-79389f55898e\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.352722 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tljv\" (UniqueName: \"kubernetes.io/projected/62c47fea-7805-489a-addc-f3e37eb30b7e-kube-api-access-2tljv\") pod \"migrator-59844c95c7-5j2ck\" (UID: \"62c47fea-7805-489a-addc-f3e37eb30b7e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.374326 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bljm6\" (UniqueName: \"kubernetes.io/projected/b7186050-ada2-4a8a-9d0d-c7059bc85a6e-kube-api-access-bljm6\") pod \"kube-storage-version-migrator-operator-b67b599dd-572xs\" (UID: \"b7186050-ada2-4a8a-9d0d-c7059bc85a6e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.388013 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-bound-sa-token\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.405996 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qklw4\" (UniqueName: \"kubernetes.io/projected/fb670ffd-f8c9-4003-bf91-b4e36c7f1292-kube-api-access-qklw4\") pod \"ingress-operator-5b745b69d9-568bd\" (UID: \"fb670ffd-f8c9-4003-bf91-b4e36c7f1292\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.428528 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqd2k\" (UniqueName: \"kubernetes.io/projected/7f3b98b0-5152-4678-bf63-a92ab6759fd4-kube-api-access-dqd2k\") pod \"openshift-config-operator-7777fb866f-h75cn\" (UID: \"7f3b98b0-5152-4678-bf63-a92ab6759fd4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.450633 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srfb\" (UniqueName: \"kubernetes.io/projected/f4f2310e-2f2f-4d0b-97a7-3b740a881646-kube-api-access-2srfb\") pod \"cluster-samples-operator-665b6dd947-qxbwf\" (UID: \"f4f2310e-2f2f-4d0b-97a7-3b740a881646\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.465975 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9c6ce5d-e71c-4bcc-ad54-d74cb789b883-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5mkmd\" (UID: \"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.485650 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfxvp\" (UniqueName: \"kubernetes.io/projected/db727f58-5ed2-4e4f-88d1-5df962353c84-kube-api-access-vfxvp\") pod \"machine-api-operator-5694c8668f-tb9sj\" (UID: \"db727f58-5ed2-4e4f-88d1-5df962353c84\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.494008 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.503214 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.507123 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z575w\" (UniqueName: \"kubernetes.io/projected/4ce29d69-5989-485a-9da1-3db91f1030fd-kube-api-access-z575w\") pod \"machine-config-controller-84d6567774-ms4l4\" (UID: \"4ce29d69-5989-485a-9da1-3db91f1030fd\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.511920 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.529079 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb92d3ea-ece6-4afc-ac78-3a35f9635095-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.547304 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bk5v\" (UniqueName: \"kubernetes.io/projected/7ec0bd3b-8a73-446d-8fb6-53c537db79f0-kube-api-access-4bk5v\") pod \"control-plane-machine-set-operator-78cbb6b69f-hv7bs\" (UID: \"7ec0bd3b-8a73-446d-8fb6-53c537db79f0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.554468 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.567122 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.569320 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.571614 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fdv6\" (UniqueName: \"kubernetes.io/projected/69e921bc-e295-4ace-a807-8768ed476321-kube-api-access-4fdv6\") pod \"openshift-controller-manager-operator-756b6f6bc6-ss65v\" (UID: \"69e921bc-e295-4ace-a807-8768ed476321\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.583584 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.588312 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slmft\" (UniqueName: \"kubernetes.io/projected/bb92d3ea-ece6-4afc-ac78-3a35f9635095-kube-api-access-slmft\") pod \"cluster-image-registry-operator-dc59b4c8b-blwkb\" (UID: \"bb92d3ea-ece6-4afc-ac78-3a35f9635095\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.591624 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.601498 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.608641 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.613723 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9556c251-d79c-4fec-a027-bf4aeb2fc4f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hpdcw\" (UID: \"9556c251-d79c-4fec-a027-bf4aeb2fc4f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.631247 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hfct\" (UniqueName: \"kubernetes.io/projected/9e5f2143-15f8-487f-96ab-0242615ed791-kube-api-access-5hfct\") pod \"dns-operator-744455d44c-9lhk8\" (UID: \"9e5f2143-15f8-487f-96ab-0242615ed791\") " pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.635034 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.644174 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.656030 4849 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.656473 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.659869 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpp46\" (UniqueName: \"kubernetes.io/projected/7b0d7c79-e28e-436d-be90-8633bef20e8f-kube-api-access-zpp46\") pod \"machine-config-operator-74547568cd-8nnjz\" (UID: \"7b0d7c79-e28e-436d-be90-8633bef20e8f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.677375 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.700855 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.721580 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.739965 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.753942 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.779042 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.800100 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-registry-tls\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.800158 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.800211 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-trusted-ca\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.800258 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w95sg\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-kube-api-access-w95sg\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.800288 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-registry-certificates\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.800310 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db498458-18d4-4142-b536-3141889616e1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.800361 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db498458-18d4-4142-b536-3141889616e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.800389 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-bound-sa-token\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: E0320 13:27:36.811305 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:37.311282778 +0000 UTC m=+206.989006183 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.819698 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.822919 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tb9sj"] Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.833623 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.839503 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909246 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909490 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4d96b99-5115-44da-92cd-e05b9450b4f0-cert\") pod \"ingress-canary-r8cpv\" (UID: \"e4d96b99-5115-44da-92cd-e05b9450b4f0\") " pod="openshift-ingress-canary/ingress-canary-r8cpv" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909534 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-metrics-tls\") pod \"dns-default-pmc7z\" (UID: \"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909592 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-secret-volume\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909629 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-registry-tls\") pod \"image-registry-697d97f7c8-ttnt5\" 
(UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909686 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bb10ff4-21a1-499c-a25f-ce649548a3b4-srv-cert\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909711 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wmj4\" (UniqueName: \"kubernetes.io/projected/4855b8cf-a062-487c-bf23-49fd7f919e7a-kube-api-access-4wmj4\") pod \"auto-csr-approver-29566886-7cjjt\" (UID: \"4855b8cf-a062-487c-bf23-49fd7f919e7a\") " pod="openshift-infra/auto-csr-approver-29566886-7cjjt" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909748 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzfk\" (UniqueName: \"kubernetes.io/projected/e4d96b99-5115-44da-92cd-e05b9450b4f0-kube-api-access-mqzfk\") pod \"ingress-canary-r8cpv\" (UID: \"e4d96b99-5115-44da-92cd-e05b9450b4f0\") " pod="openshift-ingress-canary/ingress-canary-r8cpv" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909765 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/efafcd75-fae8-4f2d-8721-f166037532c6-signing-key\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909863 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/6bb10ff4-21a1-499c-a25f-ce649548a3b4-profile-collector-cert\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909917 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d488e843-a3e6-48fd-ab56-98836840aa40-webhook-cert\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909945 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-socket-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.909962 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-registration-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910015 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a815dfc-2fe5-4207-82a4-fa8e0b237411-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ccw9b\" (UID: \"4a815dfc-2fe5-4207-82a4-fa8e0b237411\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910128 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs2rs\" (UniqueName: \"kubernetes.io/projected/3c193453-0ab4-4e10-8897-439f6a833645-kube-api-access-cs2rs\") pod \"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910169 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-serving-cert\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910187 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-config-volume\") pod \"dns-default-pmc7z\" (UID: \"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910202 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjtpj\" (UniqueName: \"kubernetes.io/projected/4a815dfc-2fe5-4207-82a4-fa8e0b237411-kube-api-access-zjtpj\") pod \"package-server-manager-789f6589d5-ccw9b\" (UID: \"4a815dfc-2fe5-4207-82a4-fa8e0b237411\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:36 crc kubenswrapper[4849]: E0320 13:27:36.910499 4849 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:37.410469371 +0000 UTC m=+207.088192766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910533 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/993eca69-6343-4f10-95d0-7f2def6430d6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xfgn4\" (UID: \"993eca69-6343-4f10-95d0-7f2def6430d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910663 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-stats-auth\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910695 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-trusted-ca\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 
13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910834 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-config\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910894 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-mountpoint-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910936 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b676f291-aafa-48d4-a5ab-0943b69b7be4-srv-cert\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910975 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-default-certificate\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.910994 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-plugins-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: 
\"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911054 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w95sg\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-kube-api-access-w95sg\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911099 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4287\" (UniqueName: \"kubernetes.io/projected/efafcd75-fae8-4f2d-8721-f166037532c6-kube-api-access-l4287\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911129 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r9r9\" (UniqueName: \"kubernetes.io/projected/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-kube-api-access-4r9r9\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911193 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgk8q\" (UniqueName: \"kubernetes.io/projected/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-kube-api-access-rgk8q\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911237 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911692 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-config\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911719 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d488e843-a3e6-48fd-ab56-98836840aa40-tmpfs\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911749 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-registry-certificates\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911767 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52rfl\" (UniqueName: \"kubernetes.io/projected/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-kube-api-access-52rfl\") pod \"dns-default-pmc7z\" (UID: 
\"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911832 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db498458-18d4-4142-b536-3141889616e1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911853 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8dkj\" (UniqueName: \"kubernetes.io/projected/b676f291-aafa-48d4-a5ab-0943b69b7be4-kube-api-access-s8dkj\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911869 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv6hz\" (UniqueName: \"kubernetes.io/projected/79437aa6-d273-4649-ac1c-e8955b940576-kube-api-access-pv6hz\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911943 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d488e843-a3e6-48fd-ab56-98836840aa40-apiservice-cert\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.911960 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-268kh\" (UniqueName: \"kubernetes.io/projected/993eca69-6343-4f10-95d0-7f2def6430d6-kube-api-access-268kh\") pod \"multus-admission-controller-857f4d67dd-xfgn4\" (UID: \"993eca69-6343-4f10-95d0-7f2def6430d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912068 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912097 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snsjk\" (UniqueName: \"kubernetes.io/projected/6bb10ff4-21a1-499c-a25f-ce649548a3b4-kube-api-access-snsjk\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912113 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912153 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79437aa6-d273-4649-ac1c-e8955b940576-service-ca-bundle\") pod 
\"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912172 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-metrics-certs\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912222 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7sx9\" (UniqueName: \"kubernetes.io/projected/d488e843-a3e6-48fd-ab56-98836840aa40-kube-api-access-t7sx9\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912239 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6nc9\" (UniqueName: \"kubernetes.io/projected/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-kube-api-access-j6nc9\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912278 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4khr\" (UniqueName: \"kubernetes.io/projected/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-kube-api-access-c4khr\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912295 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912336 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db498458-18d4-4142-b536-3141889616e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912366 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b676f291-aafa-48d4-a5ab-0943b69b7be4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912592 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-csi-data-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912735 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-bound-sa-token\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912760 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/efafcd75-fae8-4f2d-8721-f166037532c6-signing-cabundle\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912791 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3c193453-0ab4-4e10-8897-439f6a833645-node-bootstrap-token\") pod \"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.912809 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3c193453-0ab4-4e10-8897-439f6a833645-certs\") pod \"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.913760 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-config-volume\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.915478 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.916185 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-registry-certificates\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.916606 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-trusted-ca\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.920782 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db498458-18d4-4142-b536-3141889616e1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.927903 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-registry-tls\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.929518 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db498458-18d4-4142-b536-3141889616e1-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.979265 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w95sg\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-kube-api-access-w95sg\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:36 crc kubenswrapper[4849]: I0320 13:27:36.979581 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-bound-sa-token\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014443 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/993eca69-6343-4f10-95d0-7f2def6430d6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xfgn4\" (UID: \"993eca69-6343-4f10-95d0-7f2def6430d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014484 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-stats-auth\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014507 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-config\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014524 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-mountpoint-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014548 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b676f291-aafa-48d4-a5ab-0943b69b7be4-srv-cert\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014564 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-default-certificate\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014578 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-plugins-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014600 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l4287\" (UniqueName: \"kubernetes.io/projected/efafcd75-fae8-4f2d-8721-f166037532c6-kube-api-access-l4287\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014620 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r9r9\" (UniqueName: \"kubernetes.io/projected/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-kube-api-access-4r9r9\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014638 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgk8q\" (UniqueName: \"kubernetes.io/projected/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-kube-api-access-rgk8q\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014653 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014670 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-config\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014685 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d488e843-a3e6-48fd-ab56-98836840aa40-tmpfs\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014700 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52rfl\" (UniqueName: \"kubernetes.io/projected/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-kube-api-access-52rfl\") pod \"dns-default-pmc7z\" (UID: \"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014731 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dkj\" (UniqueName: \"kubernetes.io/projected/b676f291-aafa-48d4-a5ab-0943b69b7be4-kube-api-access-s8dkj\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014746 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv6hz\" (UniqueName: \"kubernetes.io/projected/79437aa6-d273-4649-ac1c-e8955b940576-kube-api-access-pv6hz\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014770 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d488e843-a3e6-48fd-ab56-98836840aa40-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014785 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-268kh\" (UniqueName: \"kubernetes.io/projected/993eca69-6343-4f10-95d0-7f2def6430d6-kube-api-access-268kh\") pod \"multus-admission-controller-857f4d67dd-xfgn4\" (UID: \"993eca69-6343-4f10-95d0-7f2def6430d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014805 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014835 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snsjk\" (UniqueName: \"kubernetes.io/projected/6bb10ff4-21a1-499c-a25f-ce649548a3b4-kube-api-access-snsjk\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014852 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:37 crc kubenswrapper[4849]: 
I0320 13:27:37.014868 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79437aa6-d273-4649-ac1c-e8955b940576-service-ca-bundle\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014883 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-metrics-certs\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014906 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7sx9\" (UniqueName: \"kubernetes.io/projected/d488e843-a3e6-48fd-ab56-98836840aa40-kube-api-access-t7sx9\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014923 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6nc9\" (UniqueName: \"kubernetes.io/projected/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-kube-api-access-j6nc9\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014945 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4khr\" (UniqueName: \"kubernetes.io/projected/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-kube-api-access-c4khr\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " 
pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014960 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014977 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b676f291-aafa-48d4-a5ab-0943b69b7be4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.014993 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-csi-data-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015011 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/efafcd75-fae8-4f2d-8721-f166037532c6-signing-cabundle\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015028 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3c193453-0ab4-4e10-8897-439f6a833645-node-bootstrap-token\") pod 
\"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015045 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3c193453-0ab4-4e10-8897-439f6a833645-certs\") pod \"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015064 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-config-volume\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015080 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-metrics-tls\") pod \"dns-default-pmc7z\" (UID: \"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015096 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-secret-volume\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015110 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4d96b99-5115-44da-92cd-e05b9450b4f0-cert\") pod 
\"ingress-canary-r8cpv\" (UID: \"e4d96b99-5115-44da-92cd-e05b9450b4f0\") " pod="openshift-ingress-canary/ingress-canary-r8cpv" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015132 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bb10ff4-21a1-499c-a25f-ce649548a3b4-srv-cert\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015148 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wmj4\" (UniqueName: \"kubernetes.io/projected/4855b8cf-a062-487c-bf23-49fd7f919e7a-kube-api-access-4wmj4\") pod \"auto-csr-approver-29566886-7cjjt\" (UID: \"4855b8cf-a062-487c-bf23-49fd7f919e7a\") " pod="openshift-infra/auto-csr-approver-29566886-7cjjt" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015163 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzfk\" (UniqueName: \"kubernetes.io/projected/e4d96b99-5115-44da-92cd-e05b9450b4f0-kube-api-access-mqzfk\") pod \"ingress-canary-r8cpv\" (UID: \"e4d96b99-5115-44da-92cd-e05b9450b4f0\") " pod="openshift-ingress-canary/ingress-canary-r8cpv" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015179 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6bb10ff4-21a1-499c-a25f-ce649548a3b4-profile-collector-cert\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015194 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/efafcd75-fae8-4f2d-8721-f166037532c6-signing-key\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015211 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d488e843-a3e6-48fd-ab56-98836840aa40-webhook-cert\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015228 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-socket-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015241 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-registration-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015258 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a815dfc-2fe5-4207-82a4-fa8e0b237411-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ccw9b\" (UID: \"4a815dfc-2fe5-4207-82a4-fa8e0b237411\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015279 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015296 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs2rs\" (UniqueName: \"kubernetes.io/projected/3c193453-0ab4-4e10-8897-439f6a833645-kube-api-access-cs2rs\") pod \"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015310 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-serving-cert\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015324 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-config-volume\") pod \"dns-default-pmc7z\" (UID: \"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.015339 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjtpj\" (UniqueName: \"kubernetes.io/projected/4a815dfc-2fe5-4207-82a4-fa8e0b237411-kube-api-access-zjtpj\") pod \"package-server-manager-789f6589d5-ccw9b\" (UID: \"4a815dfc-2fe5-4207-82a4-fa8e0b237411\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.017311 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-config\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.017648 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d488e843-a3e6-48fd-ab56-98836840aa40-tmpfs\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.022320 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d488e843-a3e6-48fd-ab56-98836840aa40-apiservice-cert\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.023781 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d488e843-a3e6-48fd-ab56-98836840aa40-webhook-cert\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.023894 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79437aa6-d273-4649-ac1c-e8955b940576-service-ca-bundle\") pod 
\"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.026436 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-mountpoint-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.027049 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-config\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.027192 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-csi-data-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.027645 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b676f291-aafa-48d4-a5ab-0943b69b7be4-srv-cert\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.028145 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:37.528130162 +0000 UTC m=+207.205853557 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.028342 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-registration-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.028615 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.028659 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-socket-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.028732 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-config-volume\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.028898 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-plugins-dir\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.029200 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/993eca69-6343-4f10-95d0-7f2def6430d6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xfgn4\" (UID: \"993eca69-6343-4f10-95d0-7f2def6430d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.029621 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-secret-volume\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.031352 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.031729 4849 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-config-volume\") pod \"dns-default-pmc7z\" (UID: \"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.032140 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/efafcd75-fae8-4f2d-8721-f166037532c6-signing-cabundle\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.032651 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-metrics-certs\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.033038 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/efafcd75-fae8-4f2d-8721-f166037532c6-signing-key\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.036272 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a815dfc-2fe5-4207-82a4-fa8e0b237411-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ccw9b\" (UID: \"4a815dfc-2fe5-4207-82a4-fa8e0b237411\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.045325 4849 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3c193453-0ab4-4e10-8897-439f6a833645-node-bootstrap-token\") pod \"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.045642 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.047602 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4d96b99-5115-44da-92cd-e05b9450b4f0-cert\") pod \"ingress-canary-r8cpv\" (UID: \"e4d96b99-5115-44da-92cd-e05b9450b4f0\") " pod="openshift-ingress-canary/ingress-canary-r8cpv" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.048529 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-stats-auth\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.048569 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-metrics-tls\") pod \"dns-default-pmc7z\" (UID: \"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.048676 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/3c193453-0ab4-4e10-8897-439f6a833645-certs\") pod \"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.049139 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/79437aa6-d273-4649-ac1c-e8955b940576-default-certificate\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.049742 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6bb10ff4-21a1-499c-a25f-ce649548a3b4-srv-cert\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.050031 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b676f291-aafa-48d4-a5ab-0943b69b7be4-profile-collector-cert\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.050403 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-serving-cert\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.058226 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6bb10ff4-21a1-499c-a25f-ce649548a3b4-profile-collector-cert\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.078765 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ztzl5" event={"ID":"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537","Type":"ContainerStarted","Data":"1026adec484469cd5dd09bb7b2c530d14fe126bee7efc4949e17180971c4c608"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.078851 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ztzl5" event={"ID":"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537","Type":"ContainerStarted","Data":"5da13af49286494d17e3f05668954aa773d596e0dc25776a32d0f00c8ad64a9f"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.081605 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" event={"ID":"3071ec75-8957-46ad-8604-eaccf482cf02","Type":"ContainerStarted","Data":"b7248f4cf0e0a77d879759f9b910a50201931807bdacf38194eff15e0dd489c4"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.083287 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g4582" event={"ID":"acf1aff9-e595-444a-965c-a95c02348257","Type":"ContainerStarted","Data":"9f6a3fb286a22b664b5af0ee377f0c8435ff0343d759fe9c4ed214f2fda740fd"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.083318 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-g4582" event={"ID":"acf1aff9-e595-444a-965c-a95c02348257","Type":"ContainerStarted","Data":"65c6707852af59c1369aa4548cd5470dd9e2c88a67f4168e35aeb7cb1f53b085"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.083797 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjtpj\" (UniqueName: \"kubernetes.io/projected/4a815dfc-2fe5-4207-82a4-fa8e0b237411-kube-api-access-zjtpj\") pod \"package-server-manager-789f6589d5-ccw9b\" (UID: \"4a815dfc-2fe5-4207-82a4-fa8e0b237411\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.090382 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" event={"ID":"cd2a2c36-750c-426a-acfa-7359c0719805","Type":"ContainerStarted","Data":"ef37234b3a6271ebce0cd0d7a2a57b89049f28046a00fe868694ef4ad8feed2b"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.099491 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" event={"ID":"db727f58-5ed2-4e4f-88d1-5df962353c84","Type":"ContainerStarted","Data":"7809582858ba5e8be4b697c692e9281084c994aa207b6afd1da3659e5ec85afb"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.102418 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" event={"ID":"d7b51068-69a5-456b-8594-202190bd605e","Type":"ContainerStarted","Data":"2168566e20bb7f842581b89fc23d432d567d5c465c31dba4901debfff03d81cb"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.102735 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" event={"ID":"d7b51068-69a5-456b-8594-202190bd605e","Type":"ContainerStarted","Data":"d494082da0c76e680e72dba345d451229b975f4ad72c06da65f7b8fefc411606"} Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.103475 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-268kh\" (UniqueName: 
\"kubernetes.io/projected/993eca69-6343-4f10-95d0-7f2def6430d6-kube-api-access-268kh\") pod \"multus-admission-controller-857f4d67dd-xfgn4\" (UID: \"993eca69-6343-4f10-95d0-7f2def6430d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.105248 4849 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-d5hbn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.105284 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" podUID="060c5f02-9012-48d7-9f95-3677026da844" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.114020 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/eaf4434b-fea3-4332-a35f-e9e4f785b1b0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-j6v5d\" (UID: \"eaf4434b-fea3-4332-a35f-e9e4f785b1b0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.116042 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.117374 4849 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:37.61734823 +0000 UTC m=+207.295071695 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.125205 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf"] Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.140638 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snsjk\" (UniqueName: \"kubernetes.io/projected/6bb10ff4-21a1-499c-a25f-ce649548a3b4-kube-api-access-snsjk\") pod \"catalog-operator-68c6474976-xrw8n\" (UID: \"6bb10ff4-21a1-499c-a25f-ce649548a3b4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.179424 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8dkj\" (UniqueName: \"kubernetes.io/projected/b676f291-aafa-48d4-a5ab-0943b69b7be4-kube-api-access-s8dkj\") pod \"olm-operator-6b444d44fb-bj5n2\" (UID: \"b676f291-aafa-48d4-a5ab-0943b69b7be4\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.183490 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52rfl\" (UniqueName: 
\"kubernetes.io/projected/6dabdf55-eae8-4c83-967c-58ebcc1d4a73-kube-api-access-52rfl\") pod \"dns-default-pmc7z\" (UID: \"6dabdf55-eae8-4c83-967c-58ebcc1d4a73\") " pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.199199 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv6hz\" (UniqueName: \"kubernetes.io/projected/79437aa6-d273-4649-ac1c-e8955b940576-kube-api-access-pv6hz\") pod \"router-default-5444994796-n5ktf\" (UID: \"79437aa6-d273-4649-ac1c-e8955b940576\") " pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.221731 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.233326 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzfk\" (UniqueName: \"kubernetes.io/projected/e4d96b99-5115-44da-92cd-e05b9450b4f0-kube-api-access-mqzfk\") pod \"ingress-canary-r8cpv\" (UID: \"e4d96b99-5115-44da-92cd-e05b9450b4f0\") " pod="openshift-ingress-canary/ingress-canary-r8cpv" Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.236168 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:37.736116462 +0000 UTC m=+207.413839867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.240525 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wmj4\" (UniqueName: \"kubernetes.io/projected/4855b8cf-a062-487c-bf23-49fd7f919e7a-kube-api-access-4wmj4\") pod \"auto-csr-approver-29566886-7cjjt\" (UID: \"4855b8cf-a062-487c-bf23-49fd7f919e7a\") " pod="openshift-infra/auto-csr-approver-29566886-7cjjt" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.266527 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.271836 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7sx9\" (UniqueName: \"kubernetes.io/projected/d488e843-a3e6-48fd-ab56-98836840aa40-kube-api-access-t7sx9\") pod \"packageserver-d55dfcdfc-s2ph5\" (UID: \"d488e843-a3e6-48fd-ab56-98836840aa40\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.272089 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.296770 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h75cn"] Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.297782 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6nc9\" (UniqueName: \"kubernetes.io/projected/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-kube-api-access-j6nc9\") pod \"marketplace-operator-79b997595-v8tw5\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.310135 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4khr\" (UniqueName: \"kubernetes.io/projected/3c1a4d40-f2ed-4e90-a235-5cbc372ade36-kube-api-access-c4khr\") pod \"csi-hostpathplugin-7b6nh\" (UID: \"3c1a4d40-f2ed-4e90-a235-5cbc372ade36\") " pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.319633 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.325472 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.326007 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:37.825987699 +0000 UTC m=+207.503711094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.326125 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.335070 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.342974 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4287\" (UniqueName: \"kubernetes.io/projected/efafcd75-fae8-4f2d-8721-f166037532c6-kube-api-access-l4287\") pod \"service-ca-9c57cc56f-5747m\" (UID: \"efafcd75-fae8-4f2d-8721-f166037532c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.343232 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.349626 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs2rs\" (UniqueName: \"kubernetes.io/projected/3c193453-0ab4-4e10-8897-439f6a833645-kube-api-access-cs2rs\") pod \"machine-config-server-9zhrn\" (UID: \"3c193453-0ab4-4e10-8897-439f6a833645\") " pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.354091 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5747m" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.377128 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.382512 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.390311 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgk8q\" (UniqueName: \"kubernetes.io/projected/fdb559a4-cdb3-44d3-9910-f49f1de8d68e-kube-api-access-rgk8q\") pod \"service-ca-operator-777779d784-4t5wj\" (UID: \"fdb559a4-cdb3-44d3-9910-f49f1de8d68e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.390326 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r9r9\" (UniqueName: \"kubernetes.io/projected/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-kube-api-access-4r9r9\") pod \"collect-profiles-29566875-jr47q\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.390973 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.395396 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mwgqm"] Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.396963 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r8cpv" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.401244 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.412193 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vrpx2"] Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.412435 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.427927 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.428275 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:37.928261596 +0000 UTC m=+207.605984991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.437248 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-9zhrn" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.483233 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-568bd"] Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.530404 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.530572 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.030547354 +0000 UTC m=+207.708270749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.530628 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.541882 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.041858376 +0000 UTC m=+207.719581771 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.632444 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.632683 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.132649758 +0000 UTC m=+207.810373153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.633020 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.633405 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.133389568 +0000 UTC m=+207.811112963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.667430 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.668194 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.737233 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.737433 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.237400614 +0000 UTC m=+207.915124019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.737477 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.737891 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.237877637 +0000 UTC m=+207.915601042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: W0320 13:27:37.813944 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e134c2_39cf_49e0_949e_b43e2de6eda3.slice/crio-855067220a0ee77a0d0ecb92b0104e20a3a2cede4858ffe67edbf37b6eb771e1 WatchSource:0}: Error finding container 855067220a0ee77a0d0ecb92b0104e20a3a2cede4858ffe67edbf37b6eb771e1: Status 404 returned error can't find the container with id 855067220a0ee77a0d0ecb92b0104e20a3a2cede4858ffe67edbf37b6eb771e1 Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.838210 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.838625 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.338607952 +0000 UTC m=+208.016331347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.939702 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.950062 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:37 crc kubenswrapper[4849]: E0320 13:27:37.953173 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.453126507 +0000 UTC m=+208.130849902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:37 crc kubenswrapper[4849]: I0320 13:27:37.981386 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47558: no serving certificate available for the kubelet" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.057253 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hcx8v" podStartSLOduration=155.057229005 podStartE2EDuration="2m35.057229005s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:38.057196734 +0000 UTC m=+207.734920119" watchObservedRunningTime="2026-03-20 13:27:38.057229005 +0000 UTC m=+207.734952410" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.057708 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.058418 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:38.558401468 +0000 UTC m=+208.236124863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.060668 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47566: no serving certificate available for the kubelet" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.136699 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" event={"ID":"db727f58-5ed2-4e4f-88d1-5df962353c84","Type":"ContainerStarted","Data":"a1c666b7250affada662314987112150541ff5f002546a56002e99e61f3171b2"} Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.137708 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" event={"ID":"db727f58-5ed2-4e4f-88d1-5df962353c84","Type":"ContainerStarted","Data":"ba260a56697abc78aec682176ad0bde2a6c38899d29b7ec37b6d3b6680deac9b"} Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.139048 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mwgqm" event={"ID":"e0a6353b-f7df-4ef2-b5c0-e52f35646aba","Type":"ContainerStarted","Data":"decffda26f9b87edf2be598ec2ed4c31804ff64e08beb0b6a485386db2de19be"} Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.151147 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vrpx2" 
event={"ID":"08e134c2-39cf-49e0-949e-b43e2de6eda3","Type":"ContainerStarted","Data":"855067220a0ee77a0d0ecb92b0104e20a3a2cede4858ffe67edbf37b6eb771e1"} Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.157791 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9zhrn" event={"ID":"3c193453-0ab4-4e10-8897-439f6a833645","Type":"ContainerStarted","Data":"bd2431a7814dc6db7fdcb98aa729ad161c6d27d79e784e9cebc12cec06582b4a"} Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.158933 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.159261 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.659250966 +0000 UTC m=+208.336974361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.168065 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" event={"ID":"f4f2310e-2f2f-4d0b-97a7-3b740a881646","Type":"ContainerStarted","Data":"036a38e4287de4d5c7d449bed9960eebd7c66758d92bbbedfe761eab1fdc3378"} Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.169188 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" event={"ID":"7f3b98b0-5152-4678-bf63-a92ab6759fd4","Type":"ContainerStarted","Data":"e81bb300a8ea6b2feae4b79d83de001f4f86e7759042b372ebadffe2e5656a9f"} Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.172861 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n5ktf" event={"ID":"79437aa6-d273-4649-ac1c-e8955b940576","Type":"ContainerStarted","Data":"f34771fcd8a732cf63135a2d38afeae478d1fbb8817d1494d79f85314c880b50"} Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.173582 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47572: no serving certificate available for the kubelet" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.178904 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" event={"ID":"fb670ffd-f8c9-4003-bf91-b4e36c7f1292","Type":"ContainerStarted","Data":"8c69336cf23e841a19270483b72bb9889f02ad07584f3bd62330ba2123872cb7"} Mar 
20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.202573 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfr7p"] Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.205196 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs"] Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.253503 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4"] Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.259859 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.261158 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.761142883 +0000 UTC m=+208.438866278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.264846 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47580: no serving certificate available for the kubelet" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.291149 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" podStartSLOduration=154.29113347 podStartE2EDuration="2m34.29113347s" podCreationTimestamp="2026-03-20 13:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:38.258781838 +0000 UTC m=+207.936505263" watchObservedRunningTime="2026-03-20 13:27:38.29113347 +0000 UTC m=+207.968856865" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.335094 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" podStartSLOduration=154.33507363 podStartE2EDuration="2m34.33507363s" podCreationTimestamp="2026-03-20 13:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:38.289791103 +0000 UTC m=+207.967514508" watchObservedRunningTime="2026-03-20 13:27:38.33507363 +0000 UTC m=+208.012797035" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.361591 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.361983 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.861967041 +0000 UTC m=+208.539690436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.372491 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47590: no serving certificate available for the kubelet" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.462165 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.462336 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:38.962309806 +0000 UTC m=+208.640033211 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.462455 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.462741 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:38.962727667 +0000 UTC m=+208.640451062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.498581 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-chdbf" podStartSLOduration=155.498566335 podStartE2EDuration="2m35.498566335s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:38.497469324 +0000 UTC m=+208.175192719" watchObservedRunningTime="2026-03-20 13:27:38.498566335 +0000 UTC m=+208.176289730" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.548289 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47596: no serving certificate available for the kubelet" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.563169 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.563524 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:39.063510234 +0000 UTC m=+208.741233629 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.630328 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47612: no serving certificate available for the kubelet" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.651279 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-g4582" podStartSLOduration=155.651259262 podStartE2EDuration="2m35.651259262s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:38.649561815 +0000 UTC m=+208.327285230" watchObservedRunningTime="2026-03-20 13:27:38.651259262 +0000 UTC m=+208.328982657" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.664535 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.664875 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.164862236 +0000 UTC m=+208.842585631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.677462 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" podStartSLOduration=155.677446603 podStartE2EDuration="2m35.677446603s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:38.67662087 +0000 UTC m=+208.354344275" watchObservedRunningTime="2026-03-20 13:27:38.677446603 +0000 UTC m=+208.355170008" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.765182 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.765346 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.265299123 +0000 UTC m=+208.943022518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.765495 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.765780 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.265769756 +0000 UTC m=+208.943493151 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.787576 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ztzl5" podStartSLOduration=155.787549776 podStartE2EDuration="2m35.787549776s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:38.782192689 +0000 UTC m=+208.459916084" watchObservedRunningTime="2026-03-20 13:27:38.787549776 +0000 UTC m=+208.465273171" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.826418 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47626: no serving certificate available for the kubelet" Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.869336 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.869650 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:39.369635598 +0000 UTC m=+209.047358993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:38 crc kubenswrapper[4849]: I0320 13:27:38.971103 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:38 crc kubenswrapper[4849]: E0320 13:27:38.971504 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.471490484 +0000 UTC m=+209.149213869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.025218 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8tznr" podStartSLOduration=156.025203184 podStartE2EDuration="2m36.025203184s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:39.025180694 +0000 UTC m=+208.702904089" watchObservedRunningTime="2026-03-20 13:27:39.025203184 +0000 UTC m=+208.702926579" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.072635 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.074509 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.574493822 +0000 UTC m=+209.252217217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.136533 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.175188 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.175464 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.675453014 +0000 UTC m=+209.353176409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: W0320 13:27:39.218091 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec0bd3b_8a73_446d_8fb6_53c537db79f0.slice/crio-deb8f66e9b3b744d372899b50a01eae087d84ed165938ac74caba96e66601431 WatchSource:0}: Error finding container deb8f66e9b3b744d372899b50a01eae087d84ed165938ac74caba96e66601431: Status 404 returned error can't find the container with id deb8f66e9b3b744d372899b50a01eae087d84ed165938ac74caba96e66601431 Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.222304 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tb9sj" podStartSLOduration=156.222281264 podStartE2EDuration="2m36.222281264s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:39.217987566 +0000 UTC m=+208.895710961" watchObservedRunningTime="2026-03-20 13:27:39.222281264 +0000 UTC m=+208.900004679" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.229690 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9lhk8"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.233222 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-n5ktf" 
event={"ID":"79437aa6-d273-4649-ac1c-e8955b940576","Type":"ContainerStarted","Data":"b3d86d605c5e70589614b3462be885297b38486699d784e3c4c351dda4a9119f"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.267893 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.268483 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" event={"ID":"fb670ffd-f8c9-4003-bf91-b4e36c7f1292","Type":"ContainerStarted","Data":"efe9eeebe4179e4ccb403ca20f41b34cdac63d745919ff7982c23952483be47e"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.268602 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" event={"ID":"fb670ffd-f8c9-4003-bf91-b4e36c7f1292","Type":"ContainerStarted","Data":"1cfb33fd50571676adbff83f306b35f4820187772775987a0e8ce9888feed8b8"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.274611 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.275018 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.275131 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.283695 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.283862 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.783811839 +0000 UTC m=+209.461535234 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.288084 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.289428 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.789416144 +0000 UTC m=+209.467139529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.311427 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-n5ktf" podStartSLOduration=156.31140856 podStartE2EDuration="2m36.31140856s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:39.283225173 +0000 UTC m=+208.960948578" watchObservedRunningTime="2026-03-20 13:27:39.31140856 +0000 UTC m=+208.989131955" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.321792 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.322065 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mwgqm" event={"ID":"e0a6353b-f7df-4ef2-b5c0-e52f35646aba","Type":"ContainerStarted","Data":"aaa54ce9016d8d52ca0588e79fd770dcff25f03ed4a79719df427a318d4e1e2f"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.330314 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.341940 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k4s2s"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.343231 4849 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-568bd" podStartSLOduration=156.343213906 podStartE2EDuration="2m36.343213906s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:39.333367964 +0000 UTC m=+209.011091369" watchObservedRunningTime="2026-03-20 13:27:39.343213906 +0000 UTC m=+209.020937301" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.355404 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.355447 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.377147 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" event={"ID":"f1f2af94-ce72-498b-a231-d171ab0e8760","Type":"ContainerStarted","Data":"7f6a81a1ea195eb380f609a8d85d21420d7c6a0122f97481aeafeededf2645d1"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.390000 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mwgqm" podStartSLOduration=156.389983685 podStartE2EDuration="2m36.389983685s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 13:27:39.389614574 +0000 UTC m=+209.067337969" watchObservedRunningTime="2026-03-20 13:27:39.389983685 +0000 UTC m=+209.067707080" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.393054 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.393344 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.393370 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.893319076 +0000 UTC m=+209.571042471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.393394 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.393605 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.393916 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.893909443 +0000 UTC m=+209.571632838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.406516 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.443931 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.467609 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vrpx2" event={"ID":"08e134c2-39cf-49e0-949e-b43e2de6eda3","Type":"ContainerStarted","Data":"51fb89db011d9e875225445137eb9b7f404a563d0996b5ce0de2807e42fc4fe9"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.468842 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.493306 4849 patch_prober.go:28] interesting pod/console-operator-58897d9998-vrpx2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.493353 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vrpx2" podUID="08e134c2-39cf-49e0-949e-b43e2de6eda3" containerName="console-operator" 
probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.495174 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.495760 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:39.995746547 +0000 UTC m=+209.673469942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.537553 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vrpx2" podStartSLOduration=156.537534409 podStartE2EDuration="2m36.537534409s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:39.520101988 +0000 UTC m=+209.197825393" watchObservedRunningTime="2026-03-20 13:27:39.537534409 +0000 UTC m=+209.215257804" Mar 20 13:27:39 crc 
kubenswrapper[4849]: I0320 13:27:39.537745 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.561124 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" event={"ID":"7f3b98b0-5152-4678-bf63-a92ab6759fd4","Type":"ContainerStarted","Data":"5f1c5544d8b8f3cc2bf4ca5144ec0c7beddb4ebcd8a97c076fdc7936c3e68659"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.568362 4849 ???:1] "http: TLS handshake error from 192.168.126.11:47630: no serving certificate available for the kubelet" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.603443 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.607447 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:40.107432964 +0000 UTC m=+209.785156359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.613468 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.613635 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" event={"ID":"4ce29d69-5989-485a-9da1-3db91f1030fd","Type":"ContainerStarted","Data":"a1d88c9d511c8545e2301ac43d6f6ce0e1ccb71a88dbc816ac36fe314a120e1c"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.613656 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" event={"ID":"4ce29d69-5989-485a-9da1-3db91f1030fd","Type":"ContainerStarted","Data":"a3b84128f37b104ccbab2395056b6698866b087a3a0d86b0dc3a27130b718efb"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.641881 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" event={"ID":"b7186050-ada2-4a8a-9d0d-c7059bc85a6e","Type":"ContainerStarted","Data":"6bee87dacc9df840a08139063b25495cddcf4a33dd5565744d70f50379f3bf02"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.641941 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" 
event={"ID":"b7186050-ada2-4a8a-9d0d-c7059bc85a6e","Type":"ContainerStarted","Data":"2ab600a73ff43671fcc40a26ece72cba8fbcccfabc86c27050300d304f3ce9fa"} Mar 20 13:27:39 crc kubenswrapper[4849]: W0320 13:27:39.642349 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf4434b_fea3_4332_a35f_e9e4f785b1b0.slice/crio-b8672c343cb54f91d90266e08d2e53d0471274dce9c313e7c38631591e823e0c WatchSource:0}: Error finding container b8672c343cb54f91d90266e08d2e53d0471274dce9c313e7c38631591e823e0c: Status 404 returned error can't find the container with id b8672c343cb54f91d90266e08d2e53d0471274dce9c313e7c38631591e823e0c Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.645501 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.650665 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.652119 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-9zhrn" event={"ID":"3c193453-0ab4-4e10-8897-439f6a833645","Type":"ContainerStarted","Data":"d4e6f5eb1dba0c9d89dabe43e84f450cad55e56eae75a65f9ffbc720cb8b4a9a"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.653396 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.655163 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.680330 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" event={"ID":"f4f2310e-2f2f-4d0b-97a7-3b740a881646","Type":"ContainerStarted","Data":"775e2f1216b9296fe61204d0e671d1a771fce157487a1a784f95f98ecbf5db6d"} Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.705344 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.706957 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:40.206941286 +0000 UTC m=+209.884664681 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.748386 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-572xs" podStartSLOduration=156.748356757 podStartE2EDuration="2m36.748356757s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:39.746596198 +0000 UTC m=+209.424319593" watchObservedRunningTime="2026-03-20 13:27:39.748356757 +0000 UTC m=+209.426080152" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.750365 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5747m"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.811814 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.812195 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:40.312181295 +0000 UTC m=+209.989904690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.876025 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.902728 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.912369 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:39 crc kubenswrapper[4849]: E0320 13:27:39.913362 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:40.413343283 +0000 UTC m=+210.091066678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.925099 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pmc7z"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.925159 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-7cjjt"] Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.942152 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-9zhrn" podStartSLOduration=5.942130226 podStartE2EDuration="5.942130226s" podCreationTimestamp="2026-03-20 13:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:39.91471084 +0000 UTC m=+209.592434255" watchObservedRunningTime="2026-03-20 13:27:39.942130226 +0000 UTC m=+209.619853621" Mar 20 13:27:39 crc kubenswrapper[4849]: I0320 13:27:39.976892 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n"] Mar 20 13:27:39 crc kubenswrapper[4849]: W0320 13:27:39.978166 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a815dfc_2fe5_4207_82a4_fa8e0b237411.slice/crio-3df1023c79056a8d477751277813bd0bb1fbeb973c5e80c6ff075429bca9ac1c WatchSource:0}: Error finding container 3df1023c79056a8d477751277813bd0bb1fbeb973c5e80c6ff075429bca9ac1c: Status 404 
returned error can't find the container with id 3df1023c79056a8d477751277813bd0bb1fbeb973c5e80c6ff075429bca9ac1c Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.001259 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2"] Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.017523 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.018097 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:40.518086068 +0000 UTC m=+210.195809463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.022438 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r8cpv"] Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.049890 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v8tw5"] Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.061017 4849 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.124769 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.125132 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:40.625117938 +0000 UTC m=+210.302841333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.136489 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7b6nh"] Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.193457 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj"] Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.227712 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.228084 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:40.728069084 +0000 UTC m=+210.405792479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.240230 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xfgn4"] Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.285775 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:40 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:40 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:40 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.285836 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.329245 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.329536 4849 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:40.829511728 +0000 UTC m=+210.507235143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.343366 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.344431 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.376517 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.377508 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.398357 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.431161 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.433123 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:40.933108953 +0000 UTC m=+210.610832348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.540741 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.541132 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:41.041115978 +0000 UTC m=+210.718839373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.648039 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.648373 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:41.148361413 +0000 UTC m=+210.826084808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.751087 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.751932 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:41.251916886 +0000 UTC m=+210.929640281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.759295 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" event={"ID":"993eca69-6343-4f10-95d0-7f2def6430d6","Type":"ContainerStarted","Data":"5a69675666bee2002fcbc209b5e218590a61354366f0fab46648be3d5b14ed1b"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.794782 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" event={"ID":"3c1a4d40-f2ed-4e90-a235-5cbc372ade36","Type":"ContainerStarted","Data":"0907dca588ec8b910cef723cab5e1caffe23c31b6a168c78720418f6890f4edd"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.795702 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" event={"ID":"b676f291-aafa-48d4-a5ab-0943b69b7be4","Type":"ContainerStarted","Data":"fb99ad5c5b645231975f8e2578e0877caa09e8cbee1a8bea2ec83f8de2cfacd8"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.800384 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" event={"ID":"b606bf18-c941-4fe2-9edf-8e4bf69bdc68","Type":"ContainerStarted","Data":"efaa028cac3913fed2dbf5c65cafa88c82304b4dfb2d9aded181af8972900d4e"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.813560 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" 
event={"ID":"4ce29d69-5989-485a-9da1-3db91f1030fd","Type":"ContainerStarted","Data":"1d91050e829fba00a663a5b57221ef8e6281bd51f3a583c24914e7c2d67d10eb"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.823732 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" event={"ID":"f1f2af94-ce72-498b-a231-d171ab0e8760","Type":"ContainerStarted","Data":"4026a7972ffb2d250f7fe26dd8f15e0561ad74b2bbd6b02d7f983d7364910223"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.824072 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.825688 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" event={"ID":"d0b2328a-5711-41d8-b908-79389f55898e","Type":"ContainerStarted","Data":"5d03da0ec75cbc7f0e4f647a0b0f25f7a6f148c95f9de1e4aadc162a7ef449dc"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.833734 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" event={"ID":"bb92d3ea-ece6-4afc-ac78-3a35f9635095","Type":"ContainerStarted","Data":"1e8d8f7ad6d0a344c9fa57af2ac8173d1f727a336ede3bd7b6f3a1a026cd4ee5"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.833790 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" event={"ID":"bb92d3ea-ece6-4afc-ac78-3a35f9635095","Type":"ContainerStarted","Data":"f460f919bf37c51cdb22f17e287e766d682f69bff5048ec20ba263734e70de2a"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.834573 4849 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lfr7p container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 
10.217.0.27:6443: connect: connection refused" start-of-body= Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.834648 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" podUID="f1f2af94-ce72-498b-a231-d171ab0e8760" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.838184 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" event={"ID":"7b0d7c79-e28e-436d-be90-8633bef20e8f","Type":"ContainerStarted","Data":"b42aca5062a0e02454f29d4869c03ee16bf5af778fdd89ecd42e63c238395049"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.838218 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" event={"ID":"7b0d7c79-e28e-436d-be90-8633bef20e8f","Type":"ContainerStarted","Data":"1e6da4978d1b03d8143c58b6be03301a1200bc677cb7577b2e6b48420bb23ffc"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.841258 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ms4l4" podStartSLOduration=157.841247787 podStartE2EDuration="2m37.841247787s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:40.840948009 +0000 UTC m=+210.518671414" watchObservedRunningTime="2026-03-20 13:27:40.841247787 +0000 UTC m=+210.518971182" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.854746 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.855137 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:41.35512672 +0000 UTC m=+211.032850115 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.891071 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" event={"ID":"f4f2310e-2f2f-4d0b-97a7-3b740a881646","Type":"ContainerStarted","Data":"e846a000150b87690c8ca563cd14143a98aef24a77e0739ff8ce63c04b0bada1"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.900095 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r8cpv" event={"ID":"e4d96b99-5115-44da-92cd-e05b9450b4f0","Type":"ContainerStarted","Data":"a768021287a03362d30e8be3446f32dd88124c13d9d7b08d3e5e537be07f1109"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.907733 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" 
event={"ID":"6bb10ff4-21a1-499c-a25f-ce649548a3b4","Type":"ContainerStarted","Data":"dde0165a36cd8f363cdb4e11cbd1a0b026e361b9e0817db403bd26f2e8ea8bc3"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.928605 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" event={"ID":"4a815dfc-2fe5-4207-82a4-fa8e0b237411","Type":"ContainerStarted","Data":"3df1023c79056a8d477751277813bd0bb1fbeb973c5e80c6ff075429bca9ac1c"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.929443 4849 ???:1] "http: TLS handshake error from 192.168.126.11:38406: no serving certificate available for the kubelet" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.929669 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-blwkb" podStartSLOduration=157.929641003 podStartE2EDuration="2m37.929641003s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:40.879161582 +0000 UTC m=+210.556884987" watchObservedRunningTime="2026-03-20 13:27:40.929641003 +0000 UTC m=+210.607364398" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.948618 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" event={"ID":"9556c251-d79c-4fec-a027-bf4aeb2fc4f1","Type":"ContainerStarted","Data":"811867ced1e6a808cd0b4acb373710b5a906e875e7ad9af30db72e1ef01619f3"} Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.956542 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.958411 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" podStartSLOduration=157.958395095 podStartE2EDuration="2m37.958395095s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:40.92807113 +0000 UTC m=+210.605794525" watchObservedRunningTime="2026-03-20 13:27:40.958395095 +0000 UTC m=+210.636118490" Mar 20 13:27:40 crc kubenswrapper[4849]: I0320 13:27:40.966055 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" event={"ID":"fdb559a4-cdb3-44d3-9910-f49f1de8d68e","Type":"ContainerStarted","Data":"21102a150a26aa383d598341793b729b43922b1be8502b6a48a533c291d0c576"} Mar 20 13:27:40 crc kubenswrapper[4849]: E0320 13:27:40.975246 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:41.475221789 +0000 UTC m=+211.152945184 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.004755 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" event={"ID":"62c47fea-7805-489a-addc-f3e37eb30b7e","Type":"ContainerStarted","Data":"7f334c415c2652ae2ddef16672d9e54fd3511143c372dd4114afa54d92382a86"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.005016 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" event={"ID":"62c47fea-7805-489a-addc-f3e37eb30b7e","Type":"ContainerStarted","Data":"28127e125a4654c874b059851f96e3e6ed5c35a6c135e25e8ab566a670f4df9a"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.006773 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5747m" event={"ID":"efafcd75-fae8-4f2d-8721-f166037532c6","Type":"ContainerStarted","Data":"3a4a355bd5376ea8e9dd2f0d3951d828220b2b0ce142edf97054ceccbf5e778e"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.011070 4849 generic.go:334] "Generic (PLEG): container finished" podID="7f3b98b0-5152-4678-bf63-a92ab6759fd4" containerID="5f1c5544d8b8f3cc2bf4ca5144ec0c7beddb4ebcd8a97c076fdc7936c3e68659" exitCode=0 Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.011143 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" 
event={"ID":"7f3b98b0-5152-4678-bf63-a92ab6759fd4","Type":"ContainerDied","Data":"5f1c5544d8b8f3cc2bf4ca5144ec0c7beddb4ebcd8a97c076fdc7936c3e68659"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.011172 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" event={"ID":"7f3b98b0-5152-4678-bf63-a92ab6759fd4","Type":"ContainerStarted","Data":"be5317818ef36428b6e8132f51062402a6d6c737b0d22c8936416f4220b8dff7"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.011379 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.053070 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-qxbwf" podStartSLOduration=158.053053073 podStartE2EDuration="2m38.053053073s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:40.975057974 +0000 UTC m=+210.652781369" watchObservedRunningTime="2026-03-20 13:27:41.053053073 +0000 UTC m=+210.730776468" Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.058936 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.060042 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-20 13:27:41.560031005 +0000 UTC m=+211.237754400 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.116748 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" event={"ID":"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883","Type":"ContainerStarted","Data":"f6e4520364467036962b1f9ddee67a5a0db41c272631a95d927201cf234a0946"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.116800 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" event={"ID":"f9c6ce5d-e71c-4bcc-ad54-d74cb789b883","Type":"ContainerStarted","Data":"d647ad39e7052abbd162555aca862d1091d2c58577ede4651ab2779264794214"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.144152 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" podStartSLOduration=158.144132822 podStartE2EDuration="2m38.144132822s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:41.054322618 +0000 UTC m=+210.732046013" watchObservedRunningTime="2026-03-20 13:27:41.144132822 +0000 UTC m=+210.821856217" Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.160000 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.161229 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:41.661214263 +0000 UTC m=+211.338937658 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.191625 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" event={"ID":"69e921bc-e295-4ace-a807-8768ed476321","Type":"ContainerStarted","Data":"dac8445865fc4c94fe99f4835ed54013a035b28da85df87a8ac6c48077272472"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.191679 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" event={"ID":"69e921bc-e295-4ace-a807-8768ed476321","Type":"ContainerStarted","Data":"119fdbabbfde3133b9c9b7ccd6b15d451fed6522aacb68860a8a329107b61817"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.233605 4849 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" event={"ID":"eaf4434b-fea3-4332-a35f-e9e4f785b1b0","Type":"ContainerStarted","Data":"b8672c343cb54f91d90266e08d2e53d0471274dce9c313e7c38631591e823e0c"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.261637 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.263121 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:41.76310126 +0000 UTC m=+211.440824655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.283419 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" event={"ID":"7ec0bd3b-8a73-446d-8fb6-53c537db79f0","Type":"ContainerStarted","Data":"e9be91e6a7ec3982bae076068ba603595c5b15392578593561029234b562f1f1"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.283454 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" event={"ID":"7ec0bd3b-8a73-446d-8fb6-53c537db79f0","Type":"ContainerStarted","Data":"deb8f66e9b3b744d372899b50a01eae087d84ed165938ac74caba96e66601431"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.293133 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:41 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:41 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:41 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.293260 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 
13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.321178 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" event={"ID":"9e5f2143-15f8-487f-96ab-0242615ed791","Type":"ContainerStarted","Data":"bfedb6c0c92eb99c4377aea4c8765346d0c2df34d2245d6bf0492542c3b4ed72"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.321226 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" event={"ID":"9e5f2143-15f8-487f-96ab-0242615ed791","Type":"ContainerStarted","Data":"0996700cebaac244cf5ac0bdad5e1ec496b768ca1eb72b4ccbacb9f9e70d3062"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.323002 4849 patch_prober.go:28] interesting pod/apiserver-76f77b778f-g4582 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]log ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]etcd ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/max-in-flight-filter ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 13:27:41 crc kubenswrapper[4849]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 13:27:41 crc kubenswrapper[4849]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 13:27:41 
crc kubenswrapper[4849]: [+]poststarthook/openshift.io-startinformers ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 13:27:41 crc kubenswrapper[4849]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 13:27:41 crc kubenswrapper[4849]: livez check failed Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.323067 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-g4582" podUID="acf1aff9-e595-444a-965c-a95c02348257" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.341755 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" event={"ID":"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd","Type":"ContainerStarted","Data":"5b804b9e6524b753e215edeab7d1b51910fefb9760f366b8df2285adb0b53b90"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.393702 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.395035 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pmc7z" event={"ID":"6dabdf55-eae8-4c83-967c-58ebcc1d4a73","Type":"ContainerStarted","Data":"4a25914604454d533ca4e09dd4b1b5cae7b2f552152b9361f0627dfa6cec451d"} Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.395491 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:41.895465137 +0000 UTC m=+211.573188532 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.397302 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" event={"ID":"d488e843-a3e6-48fd-ab56-98836840aa40","Type":"ContainerStarted","Data":"e5db064226626e373f37b6356373e5f404bf139bf6d8b68291a0d6fe96d45c99"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.397329 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" event={"ID":"d488e843-a3e6-48fd-ab56-98836840aa40","Type":"ContainerStarted","Data":"7731d0b97da279a749107b3d51547d9190095871ca981a561bca419f5d92fd27"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.397388 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.398277 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:41.898251804 +0000 UTC m=+211.575975199 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.399913 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.400808 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" event={"ID":"4855b8cf-a062-487c-bf23-49fd7f919e7a","Type":"ContainerStarted","Data":"75dc0126dd246a9f33b87fae6fcd9a3ccf970b4e0d669cd5d2d9794f1712f4ec"} Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.410781 4849 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-s2ph5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.411149 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" podUID="d488e843-a3e6-48fd-ab56-98836840aa40" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.443650 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.443709 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.448349 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5dq64" Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.504193 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.508800 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:42.008775208 +0000 UTC m=+211.686498593 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.611445 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.614230 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:42.114214274 +0000 UTC m=+211.791937679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.717491 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.718365 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:42.218348393 +0000 UTC m=+211.896071788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.820619 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.820971 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:42.32095418 +0000 UTC m=+211.998677575 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.921973 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:41 crc kubenswrapper[4849]: E0320 13:27:41.922416 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:42.422397084 +0000 UTC m=+212.100120479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:41 crc kubenswrapper[4849]: I0320 13:27:41.949671 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vrpx2" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.030417 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.031092 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:42.531015567 +0000 UTC m=+212.208739032 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.132193 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.132732 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:42.632713419 +0000 UTC m=+212.310436814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.134056 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5mkmd" podStartSLOduration=159.134037125 podStartE2EDuration="2m39.134037125s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.132977166 +0000 UTC m=+211.810700581" watchObservedRunningTime="2026-03-20 13:27:42.134037125 +0000 UTC m=+211.811760520" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.135059 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hv7bs" podStartSLOduration=159.135050383 podStartE2EDuration="2m39.135050383s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.057169238 +0000 UTC m=+211.734892633" watchObservedRunningTime="2026-03-20 13:27:42.135050383 +0000 UTC m=+211.812773778" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.189797 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" podStartSLOduration=159.189771521 podStartE2EDuration="2m39.189771521s" podCreationTimestamp="2026-03-20 
13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.181113842 +0000 UTC m=+211.858837247" watchObservedRunningTime="2026-03-20 13:27:42.189771521 +0000 UTC m=+211.867494916" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.234808 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.235152 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:42.73510476 +0000 UTC m=+212.412828155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.290243 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:42 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:42 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:42 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.290329 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.336887 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.337348 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:42.837333927 +0000 UTC m=+212.515057322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.346505 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ss65v" podStartSLOduration=159.346492709 podStartE2EDuration="2m39.346492709s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.336103723 +0000 UTC m=+212.013827138" watchObservedRunningTime="2026-03-20 13:27:42.346492709 +0000 UTC m=+212.024216104" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.438143 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.438518 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:42.938506454 +0000 UTC m=+212.616229849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.497250 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" event={"ID":"993eca69-6343-4f10-95d0-7f2def6430d6","Type":"ContainerStarted","Data":"9314050258a320c5d81a88028a1cbc65ee1b3dcb196769f413a1c3d4cc0c5698"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.535727 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" event={"ID":"fdb559a4-cdb3-44d3-9910-f49f1de8d68e","Type":"ContainerStarted","Data":"ac8d2cc4e4307421fa9489c2b695ef4597f4bf8d3f61840348b945747856b23b"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.539933 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.540425 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:43.040409821 +0000 UTC m=+212.718133216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.548896 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" event={"ID":"62c47fea-7805-489a-addc-f3e37eb30b7e","Type":"ContainerStarted","Data":"c24e0c64373595c80e4042adabef77c939271b4f7662d6e4a580ff55dc5cdac3"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.562655 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" podStartSLOduration=159.562635064 podStartE2EDuration="2m39.562635064s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.376324641 +0000 UTC m=+212.054048046" watchObservedRunningTime="2026-03-20 13:27:42.562635064 +0000 UTC m=+212.240358479" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.564422 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4t5wj" podStartSLOduration=158.564417083 podStartE2EDuration="2m38.564417083s" podCreationTimestamp="2026-03-20 13:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.560214677 +0000 
UTC m=+212.237938072" watchObservedRunningTime="2026-03-20 13:27:42.564417083 +0000 UTC m=+212.242140478" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.571364 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" event={"ID":"b676f291-aafa-48d4-a5ab-0943b69b7be4","Type":"ContainerStarted","Data":"9271e2b48ffd0f7fe01f035fff959dfe8fb39fea27126e2f60538e579e1111a9"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.572476 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.575636 4849 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-bj5n2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.575738 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" podUID="b676f291-aafa-48d4-a5ab-0943b69b7be4" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.580760 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5j2ck" podStartSLOduration=159.580743713 podStartE2EDuration="2m39.580743713s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.579501928 +0000 UTC m=+212.257225323" watchObservedRunningTime="2026-03-20 13:27:42.580743713 +0000 UTC 
m=+212.258467108" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.601110 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" event={"ID":"eaf4434b-fea3-4332-a35f-e9e4f785b1b0","Type":"ContainerStarted","Data":"23e4a2bcecd0a1620f899cb6926edf47fc1a4b72c19ae82fdde1e47ead9d3813"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.630297 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" event={"ID":"7b0d7c79-e28e-436d-be90-8633bef20e8f","Type":"ContainerStarted","Data":"2fa3008c4e54eefc9c00d66a51b339468845e242a1ddaadd18e1c26a9fc86509"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.642699 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.643415 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.143397289 +0000 UTC m=+212.821120744 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.648165 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" event={"ID":"4a815dfc-2fe5-4207-82a4-fa8e0b237411","Type":"ContainerStarted","Data":"8408bdfb38aaa9f1a9f80b48ef65afc6f4002b8a6df56cd8ceb2472f1980981e"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.648324 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" event={"ID":"4a815dfc-2fe5-4207-82a4-fa8e0b237411","Type":"ContainerStarted","Data":"4b5690701b727bab5eb964fdc4d5285133f9ff0d776d71e24c6f3f6b39a04dd3"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.648903 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.649684 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" podStartSLOduration=159.649675082 podStartE2EDuration="2m39.649675082s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.616450786 +0000 UTC m=+212.294174181" watchObservedRunningTime="2026-03-20 13:27:42.649675082 +0000 UTC m=+212.327398477" Mar 20 13:27:42 crc 
kubenswrapper[4849]: I0320 13:27:42.673968 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5747m" event={"ID":"efafcd75-fae8-4f2d-8721-f166037532c6","Type":"ContainerStarted","Data":"9b5e710ef570821f4ad761ccc411118e07493cca7a692d3e0832784e55d0f0e5"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.684153 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" event={"ID":"d0b2328a-5711-41d8-b908-79389f55898e","Type":"ContainerStarted","Data":"57a0ce79ba62876fd641b82e33f567170269afabff8412774f67eeb82df3888f"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.685187 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-j6v5d" podStartSLOduration=159.68516854 podStartE2EDuration="2m39.68516854s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.652590422 +0000 UTC m=+212.330313837" watchObservedRunningTime="2026-03-20 13:27:42.68516854 +0000 UTC m=+212.362891935" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.686097 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8nnjz" podStartSLOduration=159.686091125 podStartE2EDuration="2m39.686091125s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.67936803 +0000 UTC m=+212.357091435" watchObservedRunningTime="2026-03-20 13:27:42.686091125 +0000 UTC m=+212.363814520" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.719576 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" event={"ID":"6bb10ff4-21a1-499c-a25f-ce649548a3b4","Type":"ContainerStarted","Data":"68312ea20d7bc153e5076445abde3d6f5efe72dae50490ca1b6a768e7fe86fb1"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.720146 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.738633 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b" podStartSLOduration=159.738613602 podStartE2EDuration="2m39.738613602s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.736431022 +0000 UTC m=+212.414154437" watchObservedRunningTime="2026-03-20 13:27:42.738613602 +0000 UTC m=+212.416336997" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.739952 4849 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xrw8n container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.740005 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" podUID="6bb10ff4-21a1-499c-a25f-ce649548a3b4" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.740237 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" event={"ID":"9556c251-d79c-4fec-a027-bf4aeb2fc4f1","Type":"ContainerStarted","Data":"e6d4540dd3c9dc2e4d8782ddfa7e95112769a8f21403c3b01e012b0544371d53"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.747064 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.747393 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.247366163 +0000 UTC m=+212.925089548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.747697 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.750013 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.250002376 +0000 UTC m=+212.927725861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.767398 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-k4s2s" podStartSLOduration=159.767381865 podStartE2EDuration="2m39.767381865s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.765737009 +0000 UTC m=+212.443460404" watchObservedRunningTime="2026-03-20 13:27:42.767381865 +0000 UTC m=+212.445105260" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.768254 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" event={"ID":"9e5f2143-15f8-487f-96ab-0242615ed791","Type":"ContainerStarted","Data":"da4ef2b4f3b8e4def17d19f44a4b70ea65992d09094f6cf8325dbfc5817dcce5"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.784665 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" event={"ID":"b606bf18-c941-4fe2-9edf-8e4bf69bdc68","Type":"ContainerStarted","Data":"994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.784715 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.791919 4849 patch_prober.go:28] 
interesting pod/marketplace-operator-79b997595-v8tw5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.791970 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.806131 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5747m" podStartSLOduration=158.806074791 podStartE2EDuration="2m38.806074791s" podCreationTimestamp="2026-03-20 13:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.79950379 +0000 UTC m=+212.477227185" watchObservedRunningTime="2026-03-20 13:27:42.806074791 +0000 UTC m=+212.483798196" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.827529 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pmc7z" event={"ID":"6dabdf55-eae8-4c83-967c-58ebcc1d4a73","Type":"ContainerStarted","Data":"ac3f9792c52feee5b469bf5304e4c0c510cab40a6e7a2afb8083fcc6c28f7761"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.829042 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.854382 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.855841 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.355800641 +0000 UTC m=+213.033524036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.861891 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" event={"ID":"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd","Type":"ContainerStarted","Data":"331cfea0569f6c546f3110a9615ddbdccdbdc69d67a3c0dfd7919dc40404bb06"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.863385 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" podStartSLOduration=159.863366579 podStartE2EDuration="2m39.863366579s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.828215341 +0000 UTC m=+212.505938756" watchObservedRunningTime="2026-03-20 13:27:42.863366579 +0000 UTC m=+212.541089974" Mar 20 
13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.863858 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hpdcw" podStartSLOduration=159.863854033 podStartE2EDuration="2m39.863854033s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.857103807 +0000 UTC m=+212.534827202" watchObservedRunningTime="2026-03-20 13:27:42.863854033 +0000 UTC m=+212.541577438" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.883644 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r8cpv" event={"ID":"e4d96b99-5115-44da-92cd-e05b9450b4f0","Type":"ContainerStarted","Data":"7ac3ec0cbdfe7ced009e3574c9e94c5c6d84bc3731b7f818b365525ad2a665dd"} Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.890546 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" podStartSLOduration=159.890533998 podStartE2EDuration="2m39.890533998s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.889731576 +0000 UTC m=+212.567454981" watchObservedRunningTime="2026-03-20 13:27:42.890533998 +0000 UTC m=+212.568257393" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.894765 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.894847 4849 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.912257 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.957757 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:42 crc kubenswrapper[4849]: E0320 13:27:42.960812 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.460797424 +0000 UTC m=+213.138520909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.977620 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9lhk8" podStartSLOduration=159.977586876 podStartE2EDuration="2m39.977586876s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.944093673 +0000 UTC m=+212.621817068" watchObservedRunningTime="2026-03-20 13:27:42.977586876 +0000 UTC m=+212.655310281" Mar 20 13:27:42 crc kubenswrapper[4849]: I0320 13:27:42.978492 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pmc7z" podStartSLOduration=8.978485111 podStartE2EDuration="8.978485111s" podCreationTimestamp="2026-03-20 13:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:42.974529362 +0000 UTC m=+212.652252757" watchObservedRunningTime="2026-03-20 13:27:42.978485111 +0000 UTC m=+212.656208506" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.060576 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.061305 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.561288151 +0000 UTC m=+213.239011546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.163069 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.163505 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.663491617 +0000 UTC m=+213.341215012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.268617 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.269161 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.769140988 +0000 UTC m=+213.446864383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.276030 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:43 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:43 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:43 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.276105 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.370071 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.370454 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:43.870431539 +0000 UTC m=+213.548154994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.484760 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.485409 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:43.985391406 +0000 UTC m=+213.663114801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.548285 4849 ???:1] "http: TLS handshake error from 192.168.126.11:38420: no serving certificate available for the kubelet" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.587977 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.588260 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.08824768 +0000 UTC m=+213.765971075 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.687009 4849 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-s2ph5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 13:27:43 crc kubenswrapper[4849]: [+]log ok Mar 20 13:27:43 crc kubenswrapper[4849]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 13:27:43 crc kubenswrapper[4849]: [+]poststarthook/max-in-flight-filter ok Mar 20 13:27:43 crc kubenswrapper[4849]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Mar 20 13:27:43 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.687072 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" podUID="d488e843-a3e6-48fd-ab56-98836840aa40" containerName="packageserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.691642 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.691780 4849 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.191760142 +0000 UTC m=+213.869483537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.691940 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.692244 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.192236915 +0000 UTC m=+213.869960310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.792847 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.793022 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.292996731 +0000 UTC m=+213.970720126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.793161 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.793460 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.293448673 +0000 UTC m=+213.971172088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.893880 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.894059 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.394032514 +0000 UTC m=+214.071755909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.894206 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.894534 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.394526618 +0000 UTC m=+214.072250013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.917427 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" event={"ID":"993eca69-6343-4f10-95d0-7f2def6430d6","Type":"ContainerStarted","Data":"6e9f33c11b88113697cae057c0eabbbc76c410ca9edfa2e2491006c8a8609a23"} Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.933484 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pmc7z" event={"ID":"6dabdf55-eae8-4c83-967c-58ebcc1d4a73","Type":"ContainerStarted","Data":"28f1a7288c91878c1575352cb42708f3465f76a868d5427b5127b2345ef8ab8e"} Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.938070 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r8cpv" podStartSLOduration=9.938053107 podStartE2EDuration="9.938053107s" podCreationTimestamp="2026-03-20 13:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:43.11132603 +0000 UTC m=+212.789049445" watchObservedRunningTime="2026-03-20 13:27:43.938053107 +0000 UTC m=+213.615776512" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.938429 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-xfgn4" podStartSLOduration=160.938421757 podStartE2EDuration="2m40.938421757s" podCreationTimestamp="2026-03-20 13:25:03 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:43.935920969 +0000 UTC m=+213.613644384" watchObservedRunningTime="2026-03-20 13:27:43.938421757 +0000 UTC m=+213.616145152" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.948907 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" event={"ID":"3c1a4d40-f2ed-4e90-a235-5cbc372ade36","Type":"ContainerStarted","Data":"0a0aedec510ca38f1e8ce4d3598290fd7b4bdb02e995f8ce157465187d8fbd3b"} Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.950626 4849 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-v8tw5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.950684 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.965153 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-bj5n2" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.974004 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-s2ph5" Mar 20 13:27:43 crc kubenswrapper[4849]: I0320 13:27:43.975084 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xrw8n" Mar 20 13:27:43 crc 
kubenswrapper[4849]: I0320 13:27:43.997467 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:43 crc kubenswrapper[4849]: E0320 13:27:43.997928 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.497904776 +0000 UTC m=+214.175628171 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.100419 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.105559 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:44.605545082 +0000 UTC m=+214.283268477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.207047 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.207460 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.707440969 +0000 UTC m=+214.385164364 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.278627 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:44 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:44 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:44 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.278691 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.308783 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.309080 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:44.809068829 +0000 UTC m=+214.486792224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.410479 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.410668 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.910636978 +0000 UTC m=+214.588360373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.410971 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.411276 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:44.911265595 +0000 UTC m=+214.588988990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.513530 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.513734 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.013708267 +0000 UTC m=+214.691431662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.513982 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.514367 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.014341095 +0000 UTC m=+214.692064480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.615667 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.615837 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.11579871 +0000 UTC m=+214.793522105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.615980 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.616297 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.116286073 +0000 UTC m=+214.794009468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.717493 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.717740 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.217712058 +0000 UTC m=+214.895435453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.819481 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.819964 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.319943384 +0000 UTC m=+214.997666849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.906016 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d5hbn"] Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.906281 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" podUID="060c5f02-9012-48d7-9f95-3677026da844" containerName="controller-manager" containerID="cri-o://da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b" gracePeriod=30 Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.914201 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.922091 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:44 crc kubenswrapper[4849]: E0320 13:27:44.922638 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:27:45.422616923 +0000 UTC m=+215.100340318 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.950377 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8"] Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.950621 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" podUID="753e3beb-9e10-4739-ad79-6ac49313ca7b" containerName="route-controller-manager" containerID="cri-o://24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6" gracePeriod=30 Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.993216 4849 generic.go:334] "Generic (PLEG): container finished" podID="f5a3fb60-fb35-41cf-af91-ce3bb50e5edd" containerID="331cfea0569f6c546f3110a9615ddbdccdbdc69d67a3c0dfd7919dc40404bb06" exitCode=0 Mar 20 13:27:44 crc kubenswrapper[4849]: I0320 13:27:44.993307 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" event={"ID":"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd","Type":"ContainerDied","Data":"331cfea0569f6c546f3110a9615ddbdccdbdc69d67a3c0dfd7919dc40404bb06"} Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.032668 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" 
event={"ID":"3c1a4d40-f2ed-4e90-a235-5cbc372ade36","Type":"ContainerStarted","Data":"34a862a5c6e0de71aee5dd97ac20ff8727be08d04df7214e52745c99bdd0dd3c"} Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.034059 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:45 crc kubenswrapper[4849]: E0320 13:27:45.034374 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.534358312 +0000 UTC m=+215.212081707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.135089 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:45 crc kubenswrapper[4849]: E0320 13:27:45.135334 4849 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.635296973 +0000 UTC m=+215.313020368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.136639 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:45 crc kubenswrapper[4849]: E0320 13:27:45.137174 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.637159764 +0000 UTC m=+215.314883359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ttnt5" (UID: "db498458-18d4-4142-b536-3141889616e1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.248697 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:45 crc kubenswrapper[4849]: E0320 13:27:45.249269 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:27:45.749249382 +0000 UTC m=+215.426972777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.252722 4849 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.255832 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xx4fv"] Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.258603 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.263907 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.276944 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xx4fv"] Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.282738 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:45 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:45 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:45 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.284250 
4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.345295 4849 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T13:27:45.252767319Z","Handler":null,"Name":""} Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.349434 4849 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.349474 4849 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.349962 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.350004 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-catalog-content\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.350063 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-utilities\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.350112 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdlp7\" (UniqueName: \"kubernetes.io/projected/63553d28-5dba-492e-b004-043ea30ee635-kube-api-access-qdlp7\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.352202 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.360751 4849 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.360830 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.362112 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-g4582" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.445858 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ttnt5\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.453631 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-catalog-content\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.454060 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-utilities\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " 
pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.454153 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdlp7\" (UniqueName: \"kubernetes.io/projected/63553d28-5dba-492e-b004-043ea30ee635-kube-api-access-qdlp7\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.455675 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-catalog-content\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.455788 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-utilities\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.483311 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ft4dw"] Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.484518 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.513285 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.516691 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ft4dw"] Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.556492 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.556699 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkxh8\" (UniqueName: \"kubernetes.io/projected/b7396166-d1a2-4565-8ccc-3ed06ce215f4-kube-api-access-hkxh8\") pod \"certified-operators-ft4dw\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.556726 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-utilities\") pod \"certified-operators-ft4dw\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.557040 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-catalog-content\") pod \"certified-operators-ft4dw\" (UID: 
\"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.565306 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdlp7\" (UniqueName: \"kubernetes.io/projected/63553d28-5dba-492e-b004-043ea30ee635-kube-api-access-qdlp7\") pod \"community-operators-xx4fv\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.590791 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.591526 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h75cn" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.624688 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.629658 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.643061 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5zspj"] Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.644888 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.656805 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zspj"] Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.658211 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-catalog-content\") pod \"certified-operators-ft4dw\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.658537 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkxh8\" (UniqueName: \"kubernetes.io/projected/b7396166-d1a2-4565-8ccc-3ed06ce215f4-kube-api-access-hkxh8\") pod \"certified-operators-ft4dw\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.658662 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-utilities\") pod \"certified-operators-ft4dw\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.659271 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-utilities\") pod \"certified-operators-ft4dw\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.659889 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-catalog-content\") pod \"certified-operators-ft4dw\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.705726 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkxh8\" (UniqueName: \"kubernetes.io/projected/b7396166-d1a2-4565-8ccc-3ed06ce215f4-kube-api-access-hkxh8\") pod \"certified-operators-ft4dw\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.715274 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.745114 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.745758 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.748897 4849 patch_prober.go:28] interesting pod/console-f9d7485db-ztzl5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.748942 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ztzl5" podUID="200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.759737 4849 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.764426 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-utilities\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.764521 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-catalog-content\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.764552 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4g5\" (UniqueName: \"kubernetes.io/projected/5e607d4a-4c18-4de3-9b29-c5f32fadee50-kube-api-access-xf4g5\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.839361 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qf9r2"] Mar 20 13:27:45 crc kubenswrapper[4849]: E0320 13:27:45.840107 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753e3beb-9e10-4739-ad79-6ac49313ca7b" containerName="route-controller-manager" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.840127 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="753e3beb-9e10-4739-ad79-6ac49313ca7b" containerName="route-controller-manager" Mar 20 13:27:45 crc kubenswrapper[4849]: E0320 
13:27:45.840140 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060c5f02-9012-48d7-9f95-3677026da844" containerName="controller-manager" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.840149 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="060c5f02-9012-48d7-9f95-3677026da844" containerName="controller-manager" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.840256 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="753e3beb-9e10-4739-ad79-6ac49313ca7b" containerName="route-controller-manager" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.840276 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="060c5f02-9012-48d7-9f95-3677026da844" containerName="controller-manager" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.841105 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.855017 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qf9r2"] Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.893301 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.893755 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/060c5f02-9012-48d7-9f95-3677026da844-serving-cert\") pod \"060c5f02-9012-48d7-9f95-3677026da844\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.893871 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-client-ca\") pod \"753e3beb-9e10-4739-ad79-6ac49313ca7b\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.893906 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-config\") pod \"753e3beb-9e10-4739-ad79-6ac49313ca7b\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.893947 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-client-ca\") pod \"060c5f02-9012-48d7-9f95-3677026da844\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.894020 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frd9f\" (UniqueName: \"kubernetes.io/projected/753e3beb-9e10-4739-ad79-6ac49313ca7b-kube-api-access-frd9f\") pod \"753e3beb-9e10-4739-ad79-6ac49313ca7b\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.894064 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkhpg\" 
(UniqueName: \"kubernetes.io/projected/060c5f02-9012-48d7-9f95-3677026da844-kube-api-access-kkhpg\") pod \"060c5f02-9012-48d7-9f95-3677026da844\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.894104 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-proxy-ca-bundles\") pod \"060c5f02-9012-48d7-9f95-3677026da844\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.894135 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-config\") pod \"060c5f02-9012-48d7-9f95-3677026da844\" (UID: \"060c5f02-9012-48d7-9f95-3677026da844\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.894164 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753e3beb-9e10-4739-ad79-6ac49313ca7b-serving-cert\") pod \"753e3beb-9e10-4739-ad79-6ac49313ca7b\" (UID: \"753e3beb-9e10-4739-ad79-6ac49313ca7b\") " Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.894476 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-utilities\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.894617 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-catalog-content\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " 
pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.894642 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4g5\" (UniqueName: \"kubernetes.io/projected/5e607d4a-4c18-4de3-9b29-c5f32fadee50-kube-api-access-xf4g5\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.895276 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-config" (OuterVolumeSpecName: "config") pod "753e3beb-9e10-4739-ad79-6ac49313ca7b" (UID: "753e3beb-9e10-4739-ad79-6ac49313ca7b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.895609 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-client-ca" (OuterVolumeSpecName: "client-ca") pod "753e3beb-9e10-4739-ad79-6ac49313ca7b" (UID: "753e3beb-9e10-4739-ad79-6ac49313ca7b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.897247 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-catalog-content\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.899140 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "060c5f02-9012-48d7-9f95-3677026da844" (UID: "060c5f02-9012-48d7-9f95-3677026da844"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.899591 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-client-ca" (OuterVolumeSpecName: "client-ca") pod "060c5f02-9012-48d7-9f95-3677026da844" (UID: "060c5f02-9012-48d7-9f95-3677026da844"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.899913 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-utilities\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.900522 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-config" (OuterVolumeSpecName: "config") pod "060c5f02-9012-48d7-9f95-3677026da844" (UID: "060c5f02-9012-48d7-9f95-3677026da844"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.906376 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753e3beb-9e10-4739-ad79-6ac49313ca7b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "753e3beb-9e10-4739-ad79-6ac49313ca7b" (UID: "753e3beb-9e10-4739-ad79-6ac49313ca7b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.906416 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060c5f02-9012-48d7-9f95-3677026da844-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "060c5f02-9012-48d7-9f95-3677026da844" (UID: "060c5f02-9012-48d7-9f95-3677026da844"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.911784 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753e3beb-9e10-4739-ad79-6ac49313ca7b-kube-api-access-frd9f" (OuterVolumeSpecName: "kube-api-access-frd9f") pod "753e3beb-9e10-4739-ad79-6ac49313ca7b" (UID: "753e3beb-9e10-4739-ad79-6ac49313ca7b"). InnerVolumeSpecName "kube-api-access-frd9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.917843 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060c5f02-9012-48d7-9f95-3677026da844-kube-api-access-kkhpg" (OuterVolumeSpecName: "kube-api-access-kkhpg") pod "060c5f02-9012-48d7-9f95-3677026da844" (UID: "060c5f02-9012-48d7-9f95-3677026da844"). InnerVolumeSpecName "kube-api-access-kkhpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.936400 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4g5\" (UniqueName: \"kubernetes.io/projected/5e607d4a-4c18-4de3-9b29-c5f32fadee50-kube-api-access-xf4g5\") pod \"community-operators-5zspj\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.972601 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.996722 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-catalog-content\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.996889 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-utilities\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.996956 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fvxk\" (UniqueName: \"kubernetes.io/projected/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-kube-api-access-9fvxk\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997092 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/060c5f02-9012-48d7-9f95-3677026da844-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997107 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997203 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/753e3beb-9e10-4739-ad79-6ac49313ca7b-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997246 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997264 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frd9f\" (UniqueName: \"kubernetes.io/projected/753e3beb-9e10-4739-ad79-6ac49313ca7b-kube-api-access-frd9f\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997283 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkhpg\" (UniqueName: \"kubernetes.io/projected/060c5f02-9012-48d7-9f95-3677026da844-kube-api-access-kkhpg\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997298 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997311 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/060c5f02-9012-48d7-9f95-3677026da844-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4849]: I0320 13:27:45.997323 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/753e3beb-9e10-4739-ad79-6ac49313ca7b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.041175 4849 generic.go:334] "Generic (PLEG): container finished" podID="060c5f02-9012-48d7-9f95-3677026da844" containerID="da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b" exitCode=0 Mar 20 13:27:46 crc 
kubenswrapper[4849]: I0320 13:27:46.041244 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" event={"ID":"060c5f02-9012-48d7-9f95-3677026da844","Type":"ContainerDied","Data":"da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b"} Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.041274 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" event={"ID":"060c5f02-9012-48d7-9f95-3677026da844","Type":"ContainerDied","Data":"7955d9f9a67af7d4193d2de65862c7a4fd11a9919514595f7e7bed7d4a49bbb7"} Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.041296 4849 scope.go:117] "RemoveContainer" containerID="da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.041539 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.046793 4849 generic.go:334] "Generic (PLEG): container finished" podID="753e3beb-9e10-4739-ad79-6ac49313ca7b" containerID="24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6" exitCode=0 Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.046874 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.046891 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" event={"ID":"753e3beb-9e10-4739-ad79-6ac49313ca7b","Type":"ContainerDied","Data":"24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6"} Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.047398 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" event={"ID":"753e3beb-9e10-4739-ad79-6ac49313ca7b","Type":"ContainerDied","Data":"8c0af0a95631cb2f64fcaca2ee7cc8a8411e95513cf8e91d125ace49a605df0d"} Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.051657 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" event={"ID":"3c1a4d40-f2ed-4e90-a235-5cbc372ade36","Type":"ContainerStarted","Data":"21fffd916f2e5ce1ae8053be9e7d82b19224b18e84dfbbbf778534763ad151f1"} Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.051695 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" event={"ID":"3c1a4d40-f2ed-4e90-a235-5cbc372ade36","Type":"ContainerStarted","Data":"a50220598303dd2d797ec84d2988072795864ca472ae02d958cd842e287c6226"} Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.087777 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7b6nh" podStartSLOduration=12.087750774 podStartE2EDuration="12.087750774s" podCreationTimestamp="2026-03-20 13:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:46.08068931 +0000 UTC m=+215.758412725" watchObservedRunningTime="2026-03-20 13:27:46.087750774 
+0000 UTC m=+215.765474169" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.096978 4849 scope.go:117] "RemoveContainer" containerID="da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b" Mar 20 13:27:46 crc kubenswrapper[4849]: E0320 13:27:46.098020 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b\": container with ID starting with da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b not found: ID does not exist" containerID="da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.098052 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b"} err="failed to get container status \"da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b\": rpc error: code = NotFound desc = could not find container \"da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b\": container with ID starting with da450ffc04782c908a4026be09b3d32a62bec92ab3c610c1d33bea2b6fb2030b not found: ID does not exist" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.098078 4849 scope.go:117] "RemoveContainer" containerID="24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.098482 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fvxk\" (UniqueName: \"kubernetes.io/projected/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-kube-api-access-9fvxk\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.098587 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-catalog-content\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.098667 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-utilities\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.099873 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-utilities\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.109601 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-catalog-content\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.121003 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fvxk\" (UniqueName: \"kubernetes.io/projected/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-kube-api-access-9fvxk\") pod \"certified-operators-qf9r2\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.136032 4849 scope.go:117] "RemoveContainer" containerID="24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6" Mar 20 13:27:46 
crc kubenswrapper[4849]: E0320 13:27:46.137305 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6\": container with ID starting with 24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6 not found: ID does not exist" containerID="24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.137341 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6"} err="failed to get container status \"24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6\": rpc error: code = NotFound desc = could not find container \"24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6\": container with ID starting with 24f8b642f7f5db97599b05ca5374eddb00022c110c8a28025ff2ab3b445218b6 not found: ID does not exist" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.138732 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.141640 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.157780 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d5hbn"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.157854 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-d5hbn"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.245854 4849 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6654475965-prz44"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.246542 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.251150 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9c84796dd-7xmf7"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.251728 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.253234 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.253350 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.253445 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.253546 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.253646 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.253737 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.258011 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.258556 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.258675 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.258910 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.258920 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.259143 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.280142 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9c84796dd-7xmf7"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.283124 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.291288 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6654475965-prz44"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.303139 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:46 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:46 crc kubenswrapper[4849]: 
[+]process-running ok Mar 20 13:27:46 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.303194 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.390731 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.401228 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ttnt5"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407172 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-client-ca\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407222 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07555f73-9ea2-419b-9e3f-da6dba21c0e8-serving-cert\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407246 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-config\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " 
pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407265 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/122c80d4-7b66-4509-b228-f22910b0963a-serving-cert\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407300 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2tc4\" (UniqueName: \"kubernetes.io/projected/122c80d4-7b66-4509-b228-f22910b0963a-kube-api-access-f2tc4\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407322 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-config\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407338 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxlpx\" (UniqueName: \"kubernetes.io/projected/07555f73-9ea2-419b-9e3f-da6dba21c0e8-kube-api-access-kxlpx\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407363 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-proxy-ca-bundles\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.407393 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-client-ca\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.414290 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xx4fv"] Mar 20 13:27:46 crc kubenswrapper[4849]: W0320 13:27:46.415781 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb498458_18d4_4142_b536_3141889616e1.slice/crio-0b013f46734ac74622304560496562bd5079999bd0eab925dc5aa2e9c338bbd4 WatchSource:0}: Error finding container 0b013f46734ac74622304560496562bd5079999bd0eab925dc5aa2e9c338bbd4: Status 404 returned error can't find the container with id 0b013f46734ac74622304560496562bd5079999bd0eab925dc5aa2e9c338bbd4 Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.508741 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07555f73-9ea2-419b-9e3f-da6dba21c0e8-serving-cert\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.508828 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-config\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.508861 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/122c80d4-7b66-4509-b228-f22910b0963a-serving-cert\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.508923 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2tc4\" (UniqueName: \"kubernetes.io/projected/122c80d4-7b66-4509-b228-f22910b0963a-kube-api-access-f2tc4\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.508963 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-config\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.508987 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxlpx\" (UniqueName: \"kubernetes.io/projected/07555f73-9ea2-419b-9e3f-da6dba21c0e8-kube-api-access-kxlpx\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: 
\"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.509025 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-proxy-ca-bundles\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.509067 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-client-ca\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.509173 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-client-ca\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.511343 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-proxy-ca-bundles\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.514048 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-client-ca\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.514944 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-client-ca\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.516351 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-config\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.525532 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-config\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.526301 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.527065 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/122c80d4-7b66-4509-b228-f22910b0963a-serving-cert\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.537147 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2tc4\" (UniqueName: \"kubernetes.io/projected/122c80d4-7b66-4509-b228-f22910b0963a-kube-api-access-f2tc4\") pod \"route-controller-manager-6654475965-prz44\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.542844 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxlpx\" (UniqueName: \"kubernetes.io/projected/07555f73-9ea2-419b-9e3f-da6dba21c0e8-kube-api-access-kxlpx\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.546580 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07555f73-9ea2-419b-9e3f-da6dba21c0e8-serving-cert\") pod \"controller-manager-9c84796dd-7xmf7\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.571642 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.571690 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.572041 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.572056 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.580156 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.594172 4849 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-d5hbn container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": context deadline exceeded" start-of-body= Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.594318 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-d5hbn" podUID="060c5f02-9012-48d7-9f95-3677026da844" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": context deadline exceeded" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.607814 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.662010 4849 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lrbs8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.662076 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lrbs8" podUID="753e3beb-9e10-4739-ad79-6ac49313ca7b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.713197 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-secret-volume\") pod \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.713314 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-config-volume\") pod \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.713370 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r9r9\" (UniqueName: \"kubernetes.io/projected/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-kube-api-access-4r9r9\") pod \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\" (UID: \"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd\") " Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.718121 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-config-volume" (OuterVolumeSpecName: "config-volume") pod "f5a3fb60-fb35-41cf-af91-ce3bb50e5edd" (UID: "f5a3fb60-fb35-41cf-af91-ce3bb50e5edd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.718996 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-kube-api-access-4r9r9" (OuterVolumeSpecName: "kube-api-access-4r9r9") pod "f5a3fb60-fb35-41cf-af91-ce3bb50e5edd" (UID: "f5a3fb60-fb35-41cf-af91-ce3bb50e5edd"). InnerVolumeSpecName "kube-api-access-4r9r9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.728169 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f5a3fb60-fb35-41cf-af91-ce3bb50e5edd" (UID: "f5a3fb60-fb35-41cf-af91-ce3bb50e5edd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.817641 4849 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.817689 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r9r9\" (UniqueName: \"kubernetes.io/projected/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-kube-api-access-4r9r9\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.817702 4849 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f5a3fb60-fb35-41cf-af91-ce3bb50e5edd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.830764 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ft4dw"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.848157 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5zspj"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.889880 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6654475965-prz44"] Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.918707 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.918787 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.918840 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.918876 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.925084 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.930388 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.931076 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:46 crc kubenswrapper[4849]: I0320 13:27:46.938660 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.020396 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.025663 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8ca35818-87a2-4dac-ad57-310ffe701961-metrics-certs\") pod \"network-metrics-daemon-vm768\" (UID: \"8ca35818-87a2-4dac-ad57-310ffe701961\") " pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.058002 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.067770 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.124837 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060c5f02-9012-48d7-9f95-3677026da844" path="/var/lib/kubelet/pods/060c5f02-9012-48d7-9f95-3677026da844/volumes" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.125466 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.126615 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753e3beb-9e10-4739-ad79-6ac49313ca7b" path="/var/lib/kubelet/pods/753e3beb-9e10-4739-ad79-6ac49313ca7b/volumes" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.140001 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.141974 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qf9r2"] Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.148767 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vm768" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.172796 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" event={"ID":"db498458-18d4-4142-b536-3141889616e1","Type":"ContainerStarted","Data":"fa8ae48cdb74c441faf5f22e6576e02830f8d8cb6cb50408a054b25590f0fb24"} Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.173190 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" event={"ID":"db498458-18d4-4142-b536-3141889616e1","Type":"ContainerStarted","Data":"0b013f46734ac74622304560496562bd5079999bd0eab925dc5aa2e9c338bbd4"} Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.173867 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.179985 4849 generic.go:334] "Generic (PLEG): container finished" podID="63553d28-5dba-492e-b004-043ea30ee635" containerID="02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a" exitCode=0 Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.181428 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx4fv" event={"ID":"63553d28-5dba-492e-b004-043ea30ee635","Type":"ContainerDied","Data":"02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a"} Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.185651 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx4fv" event={"ID":"63553d28-5dba-492e-b004-043ea30ee635","Type":"ContainerStarted","Data":"3d5483b02561a2783925fa8b6014a8301fe5c0d2f7c8ea86768f263f8c85113f"} Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.190563 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-5zspj" event={"ID":"5e607d4a-4c18-4de3-9b29-c5f32fadee50","Type":"ContainerStarted","Data":"4eea9cb47fa7c872efd7b16ed772ab07dde18f98704865ab4f053c8a7a66a3ed"} Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.208996 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft4dw" event={"ID":"b7396166-d1a2-4565-8ccc-3ed06ce215f4","Type":"ContainerStarted","Data":"e77fc277cfd42e3e90d30e1a55f30ddbc19bd108f9441c54a6ab5aff35b60d9a"} Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.212487 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.212902 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-jr47q" event={"ID":"f5a3fb60-fb35-41cf-af91-ce3bb50e5edd","Type":"ContainerDied","Data":"5b804b9e6524b753e215edeab7d1b51910fefb9760f366b8df2285adb0b53b90"} Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.212920 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b804b9e6524b753e215edeab7d1b51910fefb9760f366b8df2285adb0b53b90" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.212934 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6654475965-prz44"] Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.216602 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" podStartSLOduration=164.216585064 podStartE2EDuration="2m44.216585064s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:47.207238446 +0000 
UTC m=+216.884961841" watchObservedRunningTime="2026-03-20 13:27:47.216585064 +0000 UTC m=+216.894308469" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.248168 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9c84796dd-7xmf7"] Mar 20 13:27:47 crc kubenswrapper[4849]: W0320 13:27:47.248584 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod122c80d4_7b66_4509_b228_f22910b0963a.slice/crio-92748211cb6f2ab500fd2a79fa2364d3a4199da2144864b7d46c9558466e135f WatchSource:0}: Error finding container 92748211cb6f2ab500fd2a79fa2364d3a4199da2144864b7d46c9558466e135f: Status 404 returned error can't find the container with id 92748211cb6f2ab500fd2a79fa2364d3a4199da2144864b7d46c9558466e135f Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.272361 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.276871 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:47 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:47 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:47 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.276908 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.395154 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.435894 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dgk97"] Mar 20 13:27:47 crc kubenswrapper[4849]: E0320 13:27:47.441750 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5a3fb60-fb35-41cf-af91-ce3bb50e5edd" containerName="collect-profiles" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.441783 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5a3fb60-fb35-41cf-af91-ce3bb50e5edd" containerName="collect-profiles" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.441938 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5a3fb60-fb35-41cf-af91-ce3bb50e5edd" containerName="collect-profiles" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.442622 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgk97"] Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.442714 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.449631 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.542448 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-catalog-content\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.542516 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-utilities\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.542552 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvpp\" (UniqueName: \"kubernetes.io/projected/b7e8bcae-39ef-4786-b2b8-18dea74380fa-kube-api-access-9vvpp\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.650355 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-catalog-content\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.650402 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-utilities\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.652041 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvpp\" (UniqueName: \"kubernetes.io/projected/b7e8bcae-39ef-4786-b2b8-18dea74380fa-kube-api-access-9vvpp\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.657092 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-utilities\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.657247 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-catalog-content\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.675410 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvpp\" (UniqueName: \"kubernetes.io/projected/b7e8bcae-39ef-4786-b2b8-18dea74380fa-kube-api-access-9vvpp\") pod \"redhat-marketplace-dgk97\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.770349 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-vm768"] Mar 20 13:27:47 crc kubenswrapper[4849]: W0320 13:27:47.791652 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ca35818_87a2_4dac_ad57_310ffe701961.slice/crio-99fb94dc7416311a3514b568de087dba6e61a31eb325aa14fec0600cc943c4d8 WatchSource:0}: Error finding container 99fb94dc7416311a3514b568de087dba6e61a31eb325aa14fec0600cc943c4d8: Status 404 returned error can't find the container with id 99fb94dc7416311a3514b568de087dba6e61a31eb325aa14fec0600cc943c4d8 Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.837900 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6zwxp"] Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.838936 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.861084 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zwxp"] Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.916542 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.961264 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbb4d\" (UniqueName: \"kubernetes.io/projected/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-kube-api-access-nbb4d\") pod \"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.961349 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-utilities\") pod \"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:47 crc kubenswrapper[4849]: I0320 13:27:47.961409 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-catalog-content\") pod \"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.075628 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbb4d\" (UniqueName: \"kubernetes.io/projected/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-kube-api-access-nbb4d\") pod \"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.075706 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-utilities\") pod 
\"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.075765 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-catalog-content\") pod \"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.076307 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-catalog-content\") pod \"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.076346 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-utilities\") pod \"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.115394 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbb4d\" (UniqueName: \"kubernetes.io/projected/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-kube-api-access-nbb4d\") pod \"redhat-marketplace-6zwxp\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.159593 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.254914 4849 generic.go:334] "Generic (PLEG): container finished" podID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerID="509874ccafb17caeb23da1e9f48b80faca8b480febc2c9037da606b6b7708a63" exitCode=0 Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.256457 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf9r2" event={"ID":"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e","Type":"ContainerDied","Data":"509874ccafb17caeb23da1e9f48b80faca8b480febc2c9037da606b6b7708a63"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.256488 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf9r2" event={"ID":"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e","Type":"ContainerStarted","Data":"ef26b5e699f9eda4372e9cccacac3868b1a718aa32de1daac348a66b5a75d0c2"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.285504 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:48 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:48 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:48 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.285567 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.289333 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2e36dc98e53eb62128daca80029202386de304c61d9975ef1f758468022b8572"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.289385 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a0515f027e4c908e6af9dfd774fe260dc9fe593678bea08edc18e6dccda2a587"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.292568 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"68227eb5802f11cbcd6e29f89b8c2618261a1e37e25dec40c1e291ac6e7f2a62"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.292604 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b123eaf2c2cc3adeee99b9c4cbf5b600e2e0b178d6a5680b78bcc524c09c84f1"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.294403 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"07ca477a2de7d07c724a596dc3f8b677196eb282829fdef9b676f75ccd91a264"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.294426 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7faf08dcb29f6416ecd2856c549d82d9d997d92885e52105a1e44d3856e926d5"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.296426 4849 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.298905 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" event={"ID":"122c80d4-7b66-4509-b228-f22910b0963a","Type":"ContainerStarted","Data":"593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.298936 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" event={"ID":"122c80d4-7b66-4509-b228-f22910b0963a","Type":"ContainerStarted","Data":"92748211cb6f2ab500fd2a79fa2364d3a4199da2144864b7d46c9558466e135f"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.299060 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" podUID="122c80d4-7b66-4509-b228-f22910b0963a" containerName="route-controller-manager" containerID="cri-o://593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1" gracePeriod=30 Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.299491 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.304655 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vm768" event={"ID":"8ca35818-87a2-4dac-ad57-310ffe701961","Type":"ContainerStarted","Data":"99fb94dc7416311a3514b568de087dba6e61a31eb325aa14fec0600cc943c4d8"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.309145 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" 
event={"ID":"07555f73-9ea2-419b-9e3f-da6dba21c0e8","Type":"ContainerStarted","Data":"2a2f9bd21e2ca3f104edb03c267f6465cbd3f1b192cd99635ee518165570c362"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.309183 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" event={"ID":"07555f73-9ea2-419b-9e3f-da6dba21c0e8","Type":"ContainerStarted","Data":"dda8c231db34d886bf81984497879b234ac4df7ccfb6b47acec7891d42151630"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.310197 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.313178 4849 generic.go:334] "Generic (PLEG): container finished" podID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerID="1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a" exitCode=0 Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.313318 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zspj" event={"ID":"5e607d4a-4c18-4de3-9b29-c5f32fadee50","Type":"ContainerDied","Data":"1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.329036 4849 patch_prober.go:28] interesting pod/route-controller-manager-6654475965-prz44 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": read tcp 10.217.0.2:47994->10.217.0.49:8443: read: connection reset by peer" start-of-body= Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.329096 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" podUID="122c80d4-7b66-4509-b228-f22910b0963a" containerName="route-controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.49:8443/healthz\": read tcp 10.217.0.2:47994->10.217.0.49:8443: read: connection reset by peer" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.329336 4849 generic.go:334] "Generic (PLEG): container finished" podID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerID="b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff" exitCode=0 Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.329387 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft4dw" event={"ID":"b7396166-d1a2-4565-8ccc-3ed06ce215f4","Type":"ContainerDied","Data":"b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff"} Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.330174 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.343116 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" podStartSLOduration=3.343092341 podStartE2EDuration="3.343092341s" podCreationTimestamp="2026-03-20 13:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:48.340404097 +0000 UTC m=+218.018127512" watchObservedRunningTime="2026-03-20 13:27:48.343092341 +0000 UTC m=+218.020815736" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.394953 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" podStartSLOduration=3.394915048 podStartE2EDuration="3.394915048s" podCreationTimestamp="2026-03-20 13:27:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:48.357198069 +0000 UTC 
m=+218.034921484" watchObservedRunningTime="2026-03-20 13:27:48.394915048 +0000 UTC m=+218.072638453" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.494357 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lnk65"] Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.496336 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.499401 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.516897 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lnk65"] Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.593614 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-utilities\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.593697 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm282\" (UniqueName: \"kubernetes.io/projected/02c87e15-4f0c-422f-812b-5a4bcbf1b639-kube-api-access-xm282\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.593728 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-catalog-content\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " 
pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.695914 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-utilities\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.696028 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm282\" (UniqueName: \"kubernetes.io/projected/02c87e15-4f0c-422f-812b-5a4bcbf1b639-kube-api-access-xm282\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.696058 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-catalog-content\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.697311 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-utilities\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.697419 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-catalog-content\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc 
kubenswrapper[4849]: I0320 13:27:48.722717 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm282\" (UniqueName: \"kubernetes.io/projected/02c87e15-4f0c-422f-812b-5a4bcbf1b639-kube-api-access-xm282\") pod \"redhat-operators-lnk65\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.727380 4849 ???:1] "http: TLS handshake error from 192.168.126.11:38434: no serving certificate available for the kubelet" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.730192 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zwxp"] Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.804693 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgk97"] Mar 20 13:27:48 crc kubenswrapper[4849]: W0320 13:27:48.829831 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e8bcae_39ef_4786_b2b8_18dea74380fa.slice/crio-6dbb96491077bcbb88392d125176c9ccbd562b7dbc3bff56a02f54a474c87d93 WatchSource:0}: Error finding container 6dbb96491077bcbb88392d125176c9ccbd562b7dbc3bff56a02f54a474c87d93: Status 404 returned error can't find the container with id 6dbb96491077bcbb88392d125176c9ccbd562b7dbc3bff56a02f54a474c87d93 Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.853471 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v2fkf"] Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.854516 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.861931 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.876195 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v2fkf"] Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.897608 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-utilities\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.897693 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ndjl\" (UniqueName: \"kubernetes.io/projected/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-kube-api-access-4ndjl\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:48 crc kubenswrapper[4849]: I0320 13:27:48.897713 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-catalog-content\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.000778 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-utilities\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.000892 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4ndjl\" (UniqueName: \"kubernetes.io/projected/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-kube-api-access-4ndjl\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.000928 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-catalog-content\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.001448 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-catalog-content\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.002251 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-utilities\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.022123 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ndjl\" (UniqueName: \"kubernetes.io/projected/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-kube-api-access-4ndjl\") pod \"redhat-operators-v2fkf\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.101297 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.147929 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x"] Mar 20 13:27:49 crc kubenswrapper[4849]: E0320 13:27:49.150571 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122c80d4-7b66-4509-b228-f22910b0963a" containerName="route-controller-manager" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.150605 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="122c80d4-7b66-4509-b228-f22910b0963a" containerName="route-controller-manager" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.150883 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="122c80d4-7b66-4509-b228-f22910b0963a" containerName="route-controller-manager" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.151576 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.155657 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x"] Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.203669 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/122c80d4-7b66-4509-b228-f22910b0963a-serving-cert\") pod \"122c80d4-7b66-4509-b228-f22910b0963a\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.203731 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-client-ca\") pod \"122c80d4-7b66-4509-b228-f22910b0963a\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.203781 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2tc4\" (UniqueName: \"kubernetes.io/projected/122c80d4-7b66-4509-b228-f22910b0963a-kube-api-access-f2tc4\") pod \"122c80d4-7b66-4509-b228-f22910b0963a\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.203837 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-config\") pod \"122c80d4-7b66-4509-b228-f22910b0963a\" (UID: \"122c80d4-7b66-4509-b228-f22910b0963a\") " Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.204118 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-client-ca\") pod 
\"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.204158 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-serving-cert\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.204190 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntvq\" (UniqueName: \"kubernetes.io/projected/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-kube-api-access-nntvq\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.204267 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-config\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.206330 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-client-ca" (OuterVolumeSpecName: "client-ca") pod "122c80d4-7b66-4509-b228-f22910b0963a" (UID: "122c80d4-7b66-4509-b228-f22910b0963a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.206359 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-config" (OuterVolumeSpecName: "config") pod "122c80d4-7b66-4509-b228-f22910b0963a" (UID: "122c80d4-7b66-4509-b228-f22910b0963a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.222035 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122c80d4-7b66-4509-b228-f22910b0963a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "122c80d4-7b66-4509-b228-f22910b0963a" (UID: "122c80d4-7b66-4509-b228-f22910b0963a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.232030 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122c80d4-7b66-4509-b228-f22910b0963a-kube-api-access-f2tc4" (OuterVolumeSpecName: "kube-api-access-f2tc4") pod "122c80d4-7b66-4509-b228-f22910b0963a" (UID: "122c80d4-7b66-4509-b228-f22910b0963a"). InnerVolumeSpecName "kube-api-access-f2tc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.268246 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.286041 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:49 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:49 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:49 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.286095 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.307964 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-config\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.308043 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-client-ca\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.308079 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-serving-cert\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.308104 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntvq\" (UniqueName: \"kubernetes.io/projected/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-kube-api-access-nntvq\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.308170 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2tc4\" (UniqueName: \"kubernetes.io/projected/122c80d4-7b66-4509-b228-f22910b0963a-kube-api-access-f2tc4\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.308189 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.308201 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/122c80d4-7b66-4509-b228-f22910b0963a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.308212 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/122c80d4-7b66-4509-b228-f22910b0963a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.310393 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-config\") 
pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.311024 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-client-ca\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.321532 4849 ???:1] "http: TLS handshake error from 192.168.126.11:38438: no serving certificate available for the kubelet" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.322075 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-serving-cert\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.344187 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vm768" event={"ID":"8ca35818-87a2-4dac-ad57-310ffe701961","Type":"ContainerStarted","Data":"018ec9022c7165bcdd1a08f6cdfd8e69f82d1782878255321754d961149905a0"} Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.344235 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vm768" event={"ID":"8ca35818-87a2-4dac-ad57-310ffe701961","Type":"ContainerStarted","Data":"4fee2b7dffd137000ca1ff8f6c9ac8d61ff67d9f6879c2409b31d258d48e9259"} Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.348666 4849 generic.go:334] "Generic (PLEG): container finished" 
podID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerID="ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe" exitCode=0 Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.348735 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zwxp" event={"ID":"ee7ffb06-f91c-4469-9c5d-ee0a4296c805","Type":"ContainerDied","Data":"ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe"} Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.348762 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zwxp" event={"ID":"ee7ffb06-f91c-4469-9c5d-ee0a4296c805","Type":"ContainerStarted","Data":"c7e51cbed536940c4857fd57ff7693bd7ec380c83bebce4e4bf8c514f2cfaa5d"} Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.351317 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntvq\" (UniqueName: \"kubernetes.io/projected/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-kube-api-access-nntvq\") pod \"route-controller-manager-59d8f7b698-vl82x\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.377851 4849 generic.go:334] "Generic (PLEG): container finished" podID="122c80d4-7b66-4509-b228-f22910b0963a" containerID="593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1" exitCode=0 Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.377939 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" event={"ID":"122c80d4-7b66-4509-b228-f22910b0963a","Type":"ContainerDied","Data":"593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1"} Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.377964 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" event={"ID":"122c80d4-7b66-4509-b228-f22910b0963a","Type":"ContainerDied","Data":"92748211cb6f2ab500fd2a79fa2364d3a4199da2144864b7d46c9558466e135f"} Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.377981 4849 scope.go:117] "RemoveContainer" containerID="593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.382057 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.382734 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.383386 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6654475965-prz44" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.402328 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.408497 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vm768" podStartSLOduration=166.408477734 podStartE2EDuration="2m46.408477734s" podCreationTimestamp="2026-03-20 13:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:49.401258345 +0000 UTC m=+219.078981750" watchObservedRunningTime="2026-03-20 13:27:49.408477734 +0000 UTC m=+219.086201139" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.409812 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e61ed8df-7b9d-4253-9925-95ce4e6483af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e61ed8df-7b9d-4253-9925-95ce4e6483af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.409906 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e61ed8df-7b9d-4253-9925-95ce4e6483af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e61ed8df-7b9d-4253-9925-95ce4e6483af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.412661 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.414473 4849 generic.go:334] "Generic (PLEG): container finished" podID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerID="2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f" exitCode=0 Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.414668 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgk97" event={"ID":"b7e8bcae-39ef-4786-b2b8-18dea74380fa","Type":"ContainerDied","Data":"2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f"} Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.414716 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgk97" event={"ID":"b7e8bcae-39ef-4786-b2b8-18dea74380fa","Type":"ContainerStarted","Data":"6dbb96491077bcbb88392d125176c9ccbd562b7dbc3bff56a02f54a474c87d93"} Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.418729 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.514295 4849 scope.go:117] 
"RemoveContainer" containerID="593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.531995 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6654475965-prz44"] Mar 20 13:27:49 crc kubenswrapper[4849]: E0320 13:27:49.533603 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1\": container with ID starting with 593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1 not found: ID does not exist" containerID="593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.533707 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1"} err="failed to get container status \"593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1\": rpc error: code = NotFound desc = could not find container \"593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1\": container with ID starting with 593e161e20d4598c011ee266ab9cefc9b55e5e1727807f59ed352b4bd0a3b6a1 not found: ID does not exist" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.552678 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e61ed8df-7b9d-4253-9925-95ce4e6483af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e61ed8df-7b9d-4253-9925-95ce4e6483af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.552780 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e61ed8df-7b9d-4253-9925-95ce4e6483af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e61ed8df-7b9d-4253-9925-95ce4e6483af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.554027 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e61ed8df-7b9d-4253-9925-95ce4e6483af-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e61ed8df-7b9d-4253-9925-95ce4e6483af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.569565 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6654475965-prz44"] Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.575184 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.576213 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.579300 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.579372 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.582765 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.591651 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e61ed8df-7b9d-4253-9925-95ce4e6483af-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e61ed8df-7b9d-4253-9925-95ce4e6483af\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.624379 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.654582 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bad70197-510e-4098-9083-1c3778374453-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bad70197-510e-4098-9083-1c3778374453\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.654668 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bad70197-510e-4098-9083-1c3778374453-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bad70197-510e-4098-9083-1c3778374453\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.756246 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bad70197-510e-4098-9083-1c3778374453-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bad70197-510e-4098-9083-1c3778374453\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.756302 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bad70197-510e-4098-9083-1c3778374453-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bad70197-510e-4098-9083-1c3778374453\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.756810 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bad70197-510e-4098-9083-1c3778374453-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"bad70197-510e-4098-9083-1c3778374453\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.758225 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.778049 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bad70197-510e-4098-9083-1c3778374453-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bad70197-510e-4098-9083-1c3778374453\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.859351 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lnk65"] Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.914579 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:27:49 crc kubenswrapper[4849]: I0320 13:27:49.936598 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v2fkf"] Mar 20 13:27:50 crc kubenswrapper[4849]: I0320 13:27:50.276972 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:50 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:50 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:50 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:50 crc kubenswrapper[4849]: I0320 13:27:50.277206 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 20 13:27:50 crc kubenswrapper[4849]: I0320 13:27:50.348727 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:27:50 crc kubenswrapper[4849]: I0320 13:27:50.437486 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2fkf" event={"ID":"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10","Type":"ContainerStarted","Data":"cad2328bff9ce2cc589ad2c72705bc9dae22179fa1562cdbdb002233959d9ae3"} Mar 20 13:27:50 crc kubenswrapper[4849]: W0320 13:27:50.453973 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode61ed8df_7b9d_4253_9925_95ce4e6483af.slice/crio-d9fca5d28c2b7b9d396157e8aec12a02382193622fdb7b18710354cf64a4bd23 WatchSource:0}: Error finding container d9fca5d28c2b7b9d396157e8aec12a02382193622fdb7b18710354cf64a4bd23: Status 404 returned error can't find the container with id d9fca5d28c2b7b9d396157e8aec12a02382193622fdb7b18710354cf64a4bd23 Mar 20 13:27:50 crc kubenswrapper[4849]: I0320 13:27:50.455108 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnk65" event={"ID":"02c87e15-4f0c-422f-812b-5a4bcbf1b639","Type":"ContainerStarted","Data":"97d757430a5f795d6d8571640532081ba58e5efb58d48dd62d9343e6e390f7d3"} Mar 20 13:27:50 crc kubenswrapper[4849]: I0320 13:27:50.465026 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:27:50 crc kubenswrapper[4849]: W0320 13:27:50.528463 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbad70197_510e_4098_9083_1c3778374453.slice/crio-9f6c02061f55c62708ab23c6ceabffedb91f67c44fbd01a71e6e05b9da7476c6 WatchSource:0}: Error finding container 9f6c02061f55c62708ab23c6ceabffedb91f67c44fbd01a71e6e05b9da7476c6: Status 404 returned error can't find the container with id 
9f6c02061f55c62708ab23c6ceabffedb91f67c44fbd01a71e6e05b9da7476c6 Mar 20 13:27:50 crc kubenswrapper[4849]: I0320 13:27:50.630861 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x"] Mar 20 13:27:50 crc kubenswrapper[4849]: W0320 13:27:50.679088 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab4a456d_a029_417e_9eb5_7fe5c9a1958d.slice/crio-ee4c386883f24d238223f407676e201dd12dce55c090b9dcc222020795808baa WatchSource:0}: Error finding container ee4c386883f24d238223f407676e201dd12dce55c090b9dcc222020795808baa: Status 404 returned error can't find the container with id ee4c386883f24d238223f407676e201dd12dce55c090b9dcc222020795808baa Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.060387 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122c80d4-7b66-4509-b228-f22910b0963a" path="/var/lib/kubelet/pods/122c80d4-7b66-4509-b228-f22910b0963a/volumes" Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.278465 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:51 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:51 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:51 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.278559 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.514275 4849 generic.go:334] "Generic (PLEG): container 
finished" podID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerID="8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0" exitCode=0 Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.514352 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2fkf" event={"ID":"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10","Type":"ContainerDied","Data":"8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0"} Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.520533 4849 generic.go:334] "Generic (PLEG): container finished" podID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerID="805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1" exitCode=0 Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.520662 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnk65" event={"ID":"02c87e15-4f0c-422f-812b-5a4bcbf1b639","Type":"ContainerDied","Data":"805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1"} Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.529295 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bad70197-510e-4098-9083-1c3778374453","Type":"ContainerStarted","Data":"9f6c02061f55c62708ab23c6ceabffedb91f67c44fbd01a71e6e05b9da7476c6"} Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.548942 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e61ed8df-7b9d-4253-9925-95ce4e6483af","Type":"ContainerStarted","Data":"d9fca5d28c2b7b9d396157e8aec12a02382193622fdb7b18710354cf64a4bd23"} Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.577956 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" 
event={"ID":"ab4a456d-a029-417e-9eb5-7fe5c9a1958d","Type":"ContainerStarted","Data":"4a3e002bb6b67f86c94720eb495ba5605603f26af96b55adabaafaf65aa40b89"} Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.578014 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" event={"ID":"ab4a456d-a029-417e-9eb5-7fe5c9a1958d","Type":"ContainerStarted","Data":"ee4c386883f24d238223f407676e201dd12dce55c090b9dcc222020795808baa"} Mar 20 13:27:51 crc kubenswrapper[4849]: I0320 13:27:51.578689 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.023214 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.048926 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" podStartSLOduration=6.04890829 podStartE2EDuration="6.04890829s" podCreationTimestamp="2026-03-20 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:51.607765016 +0000 UTC m=+221.285488411" watchObservedRunningTime="2026-03-20 13:27:52.04890829 +0000 UTC m=+221.726631685" Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.277196 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:52 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:52 crc kubenswrapper[4849]: [+]process-running ok 
Mar 20 13:27:52 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.277262 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.407884 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pmc7z" Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.619320 4849 generic.go:334] "Generic (PLEG): container finished" podID="e61ed8df-7b9d-4253-9925-95ce4e6483af" containerID="ca1bd08e0d9e9f028e9c42b519bbcdb093527135469b827d83bbb258aa577759" exitCode=0 Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.619726 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e61ed8df-7b9d-4253-9925-95ce4e6483af","Type":"ContainerDied","Data":"ca1bd08e0d9e9f028e9c42b519bbcdb093527135469b827d83bbb258aa577759"} Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.634755 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bad70197-510e-4098-9083-1c3778374453","Type":"ContainerStarted","Data":"798431d4caa3164628890629777d7997a43ed7da2ee6e7f7f148d847a6b760ae"} Mar 20 13:27:52 crc kubenswrapper[4849]: I0320 13:27:52.668411 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.668395808 podStartE2EDuration="3.668395808s" podCreationTimestamp="2026-03-20 13:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:52.667313978 +0000 UTC m=+222.345037393" watchObservedRunningTime="2026-03-20 
13:27:52.668395808 +0000 UTC m=+222.346119203" Mar 20 13:27:53 crc kubenswrapper[4849]: I0320 13:27:53.275061 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:53 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:53 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:53 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:53 crc kubenswrapper[4849]: I0320 13:27:53.275124 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:53 crc kubenswrapper[4849]: I0320 13:27:53.676211 4849 generic.go:334] "Generic (PLEG): container finished" podID="bad70197-510e-4098-9083-1c3778374453" containerID="798431d4caa3164628890629777d7997a43ed7da2ee6e7f7f148d847a6b760ae" exitCode=0 Mar 20 13:27:53 crc kubenswrapper[4849]: I0320 13:27:53.676328 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bad70197-510e-4098-9083-1c3778374453","Type":"ContainerDied","Data":"798431d4caa3164628890629777d7997a43ed7da2ee6e7f7f148d847a6b760ae"} Mar 20 13:27:54 crc kubenswrapper[4849]: I0320 13:27:54.279146 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:54 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:54 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:54 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:54 crc 
kubenswrapper[4849]: I0320 13:27:54.279384 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:55 crc kubenswrapper[4849]: I0320 13:27:55.276364 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:55 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:55 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:55 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:55 crc kubenswrapper[4849]: I0320 13:27:55.276428 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:55 crc kubenswrapper[4849]: I0320 13:27:55.744643 4849 patch_prober.go:28] interesting pod/console-f9d7485db-ztzl5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 20 13:27:55 crc kubenswrapper[4849]: I0320 13:27:55.744710 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ztzl5" podUID="200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 20 13:27:56 crc kubenswrapper[4849]: I0320 13:27:56.278647 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:56 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:56 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:56 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:56 crc kubenswrapper[4849]: I0320 13:27:56.278729 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:56 crc kubenswrapper[4849]: I0320 13:27:56.571403 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:27:56 crc kubenswrapper[4849]: I0320 13:27:56.571583 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:27:56 crc kubenswrapper[4849]: I0320 13:27:56.571731 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:27:56 crc kubenswrapper[4849]: I0320 13:27:56.571869 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 
10.217.0.16:8080: connect: connection refused" Mar 20 13:27:57 crc kubenswrapper[4849]: I0320 13:27:57.276068 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:57 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:57 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:57 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:57 crc kubenswrapper[4849]: I0320 13:27:57.276140 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:58 crc kubenswrapper[4849]: I0320 13:27:58.275975 4849 patch_prober.go:28] interesting pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:58 crc kubenswrapper[4849]: [-]has-synced failed: reason withheld Mar 20 13:27:58 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:58 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:58 crc kubenswrapper[4849]: I0320 13:27:58.276038 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:27:59 crc kubenswrapper[4849]: I0320 13:27:59.002667 4849 ???:1] "http: TLS handshake error from 192.168.126.11:36500: no serving certificate available for the kubelet" Mar 20 13:27:59 crc kubenswrapper[4849]: I0320 13:27:59.278876 4849 patch_prober.go:28] interesting 
pod/router-default-5444994796-n5ktf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:27:59 crc kubenswrapper[4849]: [+]has-synced ok Mar 20 13:27:59 crc kubenswrapper[4849]: [+]process-running ok Mar 20 13:27:59 crc kubenswrapper[4849]: healthz check failed Mar 20 13:27:59 crc kubenswrapper[4849]: I0320 13:27:59.279274 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-n5ktf" podUID="79437aa6-d273-4649-ac1c-e8955b940576" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.139197 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566888-9ps6c"] Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.140596 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.145268 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-9ps6c"] Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.146872 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.181311 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8gvj\" (UniqueName: \"kubernetes.io/projected/5b89502e-d430-490b-83df-7e4ba6393a51-kube-api-access-f8gvj\") pod \"auto-csr-approver-29566888-9ps6c\" (UID: \"5b89502e-d430-490b-83df-7e4ba6393a51\") " pod="openshift-infra/auto-csr-approver-29566888-9ps6c" Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.276617 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.279846 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-n5ktf" Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.282501 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8gvj\" (UniqueName: \"kubernetes.io/projected/5b89502e-d430-490b-83df-7e4ba6393a51-kube-api-access-f8gvj\") pod \"auto-csr-approver-29566888-9ps6c\" (UID: \"5b89502e-d430-490b-83df-7e4ba6393a51\") " pod="openshift-infra/auto-csr-approver-29566888-9ps6c" Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.305862 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8gvj\" (UniqueName: \"kubernetes.io/projected/5b89502e-d430-490b-83df-7e4ba6393a51-kube-api-access-f8gvj\") pod \"auto-csr-approver-29566888-9ps6c\" (UID: \"5b89502e-d430-490b-83df-7e4ba6393a51\") " pod="openshift-infra/auto-csr-approver-29566888-9ps6c" Mar 20 13:28:00 crc kubenswrapper[4849]: I0320 13:28:00.461164 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" Mar 20 13:28:04 crc kubenswrapper[4849]: I0320 13:28:04.468473 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9c84796dd-7xmf7"] Mar 20 13:28:04 crc kubenswrapper[4849]: I0320 13:28:04.469047 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" podUID="07555f73-9ea2-419b-9e3f-da6dba21c0e8" containerName="controller-manager" containerID="cri-o://2a2f9bd21e2ca3f104edb03c267f6465cbd3f1b192cd99635ee518165570c362" gracePeriod=30 Mar 20 13:28:04 crc kubenswrapper[4849]: I0320 13:28:04.492106 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x"] Mar 20 13:28:04 crc kubenswrapper[4849]: I0320 13:28:04.492337 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" podUID="ab4a456d-a029-417e-9eb5-7fe5c9a1958d" containerName="route-controller-manager" containerID="cri-o://4a3e002bb6b67f86c94720eb495ba5605603f26af96b55adabaafaf65aa40b89" gracePeriod=30 Mar 20 13:28:05 crc kubenswrapper[4849]: I0320 13:28:05.631944 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:28:05 crc kubenswrapper[4849]: I0320 13:28:05.750730 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:28:05 crc kubenswrapper[4849]: I0320 13:28:05.763115 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:28:05 crc kubenswrapper[4849]: I0320 13:28:05.885393 4849 generic.go:334] "Generic (PLEG): container finished" 
podID="07555f73-9ea2-419b-9e3f-da6dba21c0e8" containerID="2a2f9bd21e2ca3f104edb03c267f6465cbd3f1b192cd99635ee518165570c362" exitCode=0 Mar 20 13:28:05 crc kubenswrapper[4849]: I0320 13:28:05.885475 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" event={"ID":"07555f73-9ea2-419b-9e3f-da6dba21c0e8","Type":"ContainerDied","Data":"2a2f9bd21e2ca3f104edb03c267f6465cbd3f1b192cd99635ee518165570c362"} Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.570269 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.570328 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.570272 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.570404 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.570485 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.571737 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"aaa54ce9016d8d52ca0588e79fd770dcff25f03ed4a79719df427a318d4e1e2f"} pod="openshift-console/downloads-7954f5f757-mwgqm" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.571781 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" containerID="cri-o://aaa54ce9016d8d52ca0588e79fd770dcff25f03ed4a79719df427a318d4e1e2f" gracePeriod=2 Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.571917 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.571953 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.609565 4849 patch_prober.go:28] interesting pod/controller-manager-9c84796dd-7xmf7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 20 13:28:06 crc kubenswrapper[4849]: I0320 13:28:06.610228 4849 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" podUID="07555f73-9ea2-419b-9e3f-da6dba21c0e8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 20 13:28:09 crc kubenswrapper[4849]: I0320 13:28:09.385152 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:28:09 crc kubenswrapper[4849]: I0320 13:28:09.385217 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:28:09 crc kubenswrapper[4849]: I0320 13:28:09.627266 4849 patch_prober.go:28] interesting pod/route-controller-manager-59d8f7b698-vl82x container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Mar 20 13:28:09 crc kubenswrapper[4849]: I0320 13:28:09.627342 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" podUID="ab4a456d-a029-417e-9eb5-7fe5c9a1958d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4849]: I0320 13:28:11.920290 4849 generic.go:334] "Generic (PLEG): container finished" 
podID="ab4a456d-a029-417e-9eb5-7fe5c9a1958d" containerID="4a3e002bb6b67f86c94720eb495ba5605603f26af96b55adabaafaf65aa40b89" exitCode=0 Mar 20 13:28:11 crc kubenswrapper[4849]: I0320 13:28:11.920379 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" event={"ID":"ab4a456d-a029-417e-9eb5-7fe5c9a1958d","Type":"ContainerDied","Data":"4a3e002bb6b67f86c94720eb495ba5605603f26af96b55adabaafaf65aa40b89"} Mar 20 13:28:11 crc kubenswrapper[4849]: I0320 13:28:11.922970 4849 generic.go:334] "Generic (PLEG): container finished" podID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerID="aaa54ce9016d8d52ca0588e79fd770dcff25f03ed4a79719df427a318d4e1e2f" exitCode=0 Mar 20 13:28:11 crc kubenswrapper[4849]: I0320 13:28:11.923005 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mwgqm" event={"ID":"e0a6353b-f7df-4ef2-b5c0-e52f35646aba","Type":"ContainerDied","Data":"aaa54ce9016d8d52ca0588e79fd770dcff25f03ed4a79719df427a318d4e1e2f"} Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.834359 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.844153 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.860299 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b58b86f48-h8hmj"] Mar 20 13:28:13 crc kubenswrapper[4849]: E0320 13:28:13.860975 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad70197-510e-4098-9083-1c3778374453" containerName="pruner" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.861003 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad70197-510e-4098-9083-1c3778374453" containerName="pruner" Mar 20 13:28:13 crc kubenswrapper[4849]: E0320 13:28:13.861016 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07555f73-9ea2-419b-9e3f-da6dba21c0e8" containerName="controller-manager" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.861037 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="07555f73-9ea2-419b-9e3f-da6dba21c0e8" containerName="controller-manager" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.861174 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad70197-510e-4098-9083-1c3778374453" containerName="pruner" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.861197 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="07555f73-9ea2-419b-9e3f-da6dba21c0e8" containerName="controller-manager" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.861652 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.886054 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b58b86f48-h8hmj"] Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.928657 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxlpx\" (UniqueName: \"kubernetes.io/projected/07555f73-9ea2-419b-9e3f-da6dba21c0e8-kube-api-access-kxlpx\") pod \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.928732 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bad70197-510e-4098-9083-1c3778374453-kubelet-dir\") pod \"bad70197-510e-4098-9083-1c3778374453\" (UID: \"bad70197-510e-4098-9083-1c3778374453\") " Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.928767 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bad70197-510e-4098-9083-1c3778374453-kube-api-access\") pod \"bad70197-510e-4098-9083-1c3778374453\" (UID: \"bad70197-510e-4098-9083-1c3778374453\") " Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.928838 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07555f73-9ea2-419b-9e3f-da6dba21c0e8-serving-cert\") pod \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.928889 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-client-ca\") pod \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\" 
(UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.928918 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-config\") pod \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.928955 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-proxy-ca-bundles\") pod \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\" (UID: \"07555f73-9ea2-419b-9e3f-da6dba21c0e8\") " Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.929123 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-proxy-ca-bundles\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.929161 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvqp2\" (UniqueName: \"kubernetes.io/projected/6b97f0a6-160b-4655-b3da-005f1b66e7ef-kube-api-access-rvqp2\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.929183 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b97f0a6-160b-4655-b3da-005f1b66e7ef-serving-cert\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: 
\"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.929286 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-client-ca\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.929343 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-config\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.929443 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bad70197-510e-4098-9083-1c3778374453-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bad70197-510e-4098-9083-1c3778374453" (UID: "bad70197-510e-4098-9083-1c3778374453"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.930483 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "07555f73-9ea2-419b-9e3f-da6dba21c0e8" (UID: "07555f73-9ea2-419b-9e3f-da6dba21c0e8"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.930687 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-config" (OuterVolumeSpecName: "config") pod "07555f73-9ea2-419b-9e3f-da6dba21c0e8" (UID: "07555f73-9ea2-419b-9e3f-da6dba21c0e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.931234 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-client-ca" (OuterVolumeSpecName: "client-ca") pod "07555f73-9ea2-419b-9e3f-da6dba21c0e8" (UID: "07555f73-9ea2-419b-9e3f-da6dba21c0e8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.940011 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad70197-510e-4098-9083-1c3778374453-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bad70197-510e-4098-9083-1c3778374453" (UID: "bad70197-510e-4098-9083-1c3778374453"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.940106 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bad70197-510e-4098-9083-1c3778374453","Type":"ContainerDied","Data":"9f6c02061f55c62708ab23c6ceabffedb91f67c44fbd01a71e6e05b9da7476c6"} Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.940139 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f6c02061f55c62708ab23c6ceabffedb91f67c44fbd01a71e6e05b9da7476c6" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.940141 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.940472 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07555f73-9ea2-419b-9e3f-da6dba21c0e8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07555f73-9ea2-419b-9e3f-da6dba21c0e8" (UID: "07555f73-9ea2-419b-9e3f-da6dba21c0e8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.941863 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" event={"ID":"07555f73-9ea2-419b-9e3f-da6dba21c0e8","Type":"ContainerDied","Data":"dda8c231db34d886bf81984497879b234ac4df7ccfb6b47acec7891d42151630"} Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.941894 4849 scope.go:117] "RemoveContainer" containerID="2a2f9bd21e2ca3f104edb03c267f6465cbd3f1b192cd99635ee518165570c362" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.941943 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9c84796dd-7xmf7" Mar 20 13:28:13 crc kubenswrapper[4849]: I0320 13:28:13.942309 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07555f73-9ea2-419b-9e3f-da6dba21c0e8-kube-api-access-kxlpx" (OuterVolumeSpecName: "kube-api-access-kxlpx") pod "07555f73-9ea2-419b-9e3f-da6dba21c0e8" (UID: "07555f73-9ea2-419b-9e3f-da6dba21c0e8"). InnerVolumeSpecName "kube-api-access-kxlpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030634 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-config\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030697 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-proxy-ca-bundles\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030724 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvqp2\" (UniqueName: \"kubernetes.io/projected/6b97f0a6-160b-4655-b3da-005f1b66e7ef-kube-api-access-rvqp2\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030744 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6b97f0a6-160b-4655-b3da-005f1b66e7ef-serving-cert\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030780 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-client-ca\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030859 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030870 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030879 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxlpx\" (UniqueName: \"kubernetes.io/projected/07555f73-9ea2-419b-9e3f-da6dba21c0e8-kube-api-access-kxlpx\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030887 4849 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bad70197-510e-4098-9083-1c3778374453-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030895 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bad70197-510e-4098-9083-1c3778374453-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 
13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030905 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07555f73-9ea2-419b-9e3f-da6dba21c0e8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.030912 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07555f73-9ea2-419b-9e3f-da6dba21c0e8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.032778 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-client-ca\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.033056 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-config\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.033536 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-proxy-ca-bundles\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.037381 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b97f0a6-160b-4655-b3da-005f1b66e7ef-serving-cert\") pod 
\"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.049091 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvqp2\" (UniqueName: \"kubernetes.io/projected/6b97f0a6-160b-4655-b3da-005f1b66e7ef-kube-api-access-rvqp2\") pod \"controller-manager-7b58b86f48-h8hmj\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.185692 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.289467 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9c84796dd-7xmf7"] Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.292603 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9c84796dd-7xmf7"] Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.362349 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.462026 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-client-ca\") pod \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.462075 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nntvq\" (UniqueName: \"kubernetes.io/projected/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-kube-api-access-nntvq\") pod \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.462155 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-serving-cert\") pod \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.462269 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-config\") pod \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\" (UID: \"ab4a456d-a029-417e-9eb5-7fe5c9a1958d\") " Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.462976 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab4a456d-a029-417e-9eb5-7fe5c9a1958d" (UID: "ab4a456d-a029-417e-9eb5-7fe5c9a1958d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.463641 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-config" (OuterVolumeSpecName: "config") pod "ab4a456d-a029-417e-9eb5-7fe5c9a1958d" (UID: "ab4a456d-a029-417e-9eb5-7fe5c9a1958d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.472931 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab4a456d-a029-417e-9eb5-7fe5c9a1958d" (UID: "ab4a456d-a029-417e-9eb5-7fe5c9a1958d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.473121 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-kube-api-access-nntvq" (OuterVolumeSpecName: "kube-api-access-nntvq") pod "ab4a456d-a029-417e-9eb5-7fe5c9a1958d" (UID: "ab4a456d-a029-417e-9eb5-7fe5c9a1958d"). InnerVolumeSpecName "kube-api-access-nntvq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.564522 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.564578 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.564595 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nntvq\" (UniqueName: \"kubernetes.io/projected/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-kube-api-access-nntvq\") on node \"crc\" DevicePath \"\""
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.564608 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab4a456d-a029-417e-9eb5-7fe5c9a1958d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.950330 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e61ed8df-7b9d-4253-9925-95ce4e6483af","Type":"ContainerDied","Data":"d9fca5d28c2b7b9d396157e8aec12a02382193622fdb7b18710354cf64a4bd23"}
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.950392 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9fca5d28c2b7b9d396157e8aec12a02382193622fdb7b18710354cf64a4bd23"
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.956114 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x" event={"ID":"ab4a456d-a029-417e-9eb5-7fe5c9a1958d","Type":"ContainerDied","Data":"ee4c386883f24d238223f407676e201dd12dce55c090b9dcc222020795808baa"}
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.956250 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x"
Mar 20 13:28:14 crc kubenswrapper[4849]: I0320 13:28:14.973274 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.008103 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x"]
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.010986 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59d8f7b698-vl82x"]
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.046092 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07555f73-9ea2-419b-9e3f-da6dba21c0e8" path="/var/lib/kubelet/pods/07555f73-9ea2-419b-9e3f-da6dba21c0e8/volumes"
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.047083 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab4a456d-a029-417e-9eb5-7fe5c9a1958d" path="/var/lib/kubelet/pods/ab4a456d-a029-417e-9eb5-7fe5c9a1958d/volumes"
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.071377 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e61ed8df-7b9d-4253-9925-95ce4e6483af-kube-api-access\") pod \"e61ed8df-7b9d-4253-9925-95ce4e6483af\" (UID: \"e61ed8df-7b9d-4253-9925-95ce4e6483af\") "
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.074277 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e61ed8df-7b9d-4253-9925-95ce4e6483af-kubelet-dir\") pod \"e61ed8df-7b9d-4253-9925-95ce4e6483af\" (UID: \"e61ed8df-7b9d-4253-9925-95ce4e6483af\") "
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.074402 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e61ed8df-7b9d-4253-9925-95ce4e6483af-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e61ed8df-7b9d-4253-9925-95ce4e6483af" (UID: "e61ed8df-7b9d-4253-9925-95ce4e6483af"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.074925 4849 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e61ed8df-7b9d-4253-9925-95ce4e6483af-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.082510 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e61ed8df-7b9d-4253-9925-95ce4e6483af-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e61ed8df-7b9d-4253-9925-95ce4e6483af" (UID: "e61ed8df-7b9d-4253-9925-95ce4e6483af"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.176015 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e61ed8df-7b9d-4253-9925-95ce4e6483af-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:28:15 crc kubenswrapper[4849]: I0320 13:28:15.961478 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.291321 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"]
Mar 20 13:28:16 crc kubenswrapper[4849]: E0320 13:28:16.308698 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e61ed8df-7b9d-4253-9925-95ce4e6483af" containerName="pruner"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.308742 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e61ed8df-7b9d-4253-9925-95ce4e6483af" containerName="pruner"
Mar 20 13:28:16 crc kubenswrapper[4849]: E0320 13:28:16.308769 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab4a456d-a029-417e-9eb5-7fe5c9a1958d" containerName="route-controller-manager"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.308777 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab4a456d-a029-417e-9eb5-7fe5c9a1958d" containerName="route-controller-manager"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.309190 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab4a456d-a029-417e-9eb5-7fe5c9a1958d" containerName="route-controller-manager"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.309215 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e61ed8df-7b9d-4253-9925-95ce4e6483af" containerName="pruner"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.310042 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.314693 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.315640 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"]
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.316935 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.317139 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.317274 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.317417 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.320239 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.411036 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-config\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.411138 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-serving-cert\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.411183 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-client-ca\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.411311 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng5qm\" (UniqueName: \"kubernetes.io/projected/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-kube-api-access-ng5qm\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.512690 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-serving-cert\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.512759 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-client-ca\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.512783 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5qm\" (UniqueName: \"kubernetes.io/projected/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-kube-api-access-ng5qm\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.512808 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-config\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.513843 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-client-ca\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.514044 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-config\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.521320 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-serving-cert\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.541331 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5qm\" (UniqueName: \"kubernetes.io/projected/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-kube-api-access-ng5qm\") pod \"route-controller-manager-6444f96cbd-2qhtf\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.570712 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.570769 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 13:28:16 crc kubenswrapper[4849]: I0320 13:28:16.634255 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"
Mar 20 13:28:16 crc kubenswrapper[4849]: E0320 13:28:16.772545 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 20 13:28:16 crc kubenswrapper[4849]: E0320 13:28:16.773080 4849 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:28:16 crc kubenswrapper[4849]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 20 13:28:16 crc kubenswrapper[4849]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4wmj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566886-7cjjt_openshift-infra(4855b8cf-a062-487c-bf23-49fd7f919e7a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 20 13:28:16 crc kubenswrapper[4849]: > logger="UnhandledError"
Mar 20 13:28:16 crc kubenswrapper[4849]: E0320 13:28:16.775361 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" podUID="4855b8cf-a062-487c-bf23-49fd7f919e7a"
Mar 20 13:28:16 crc kubenswrapper[4849]: E0320 13:28:16.977977 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" podUID="4855b8cf-a062-487c-bf23-49fd7f919e7a"
Mar 20 13:28:17 crc kubenswrapper[4849]: I0320 13:28:17.064975 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:28:17 crc kubenswrapper[4849]: I0320 13:28:17.190322 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-9ps6c"]
Mar 20 13:28:17 crc kubenswrapper[4849]: I0320 13:28:17.343891 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ccw9b"
Mar 20 13:28:20 crc kubenswrapper[4849]: I0320 13:28:20.168842 4849 scope.go:117] "RemoveContainer" containerID="4a3e002bb6b67f86c94720eb495ba5605603f26af96b55adabaafaf65aa40b89"
Mar 20 13:28:21 crc kubenswrapper[4849]: I0320 13:28:21.000801 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" event={"ID":"5b89502e-d430-490b-83df-7e4ba6393a51","Type":"ContainerStarted","Data":"3c7e1e1d02137f44217f310530380f9b7d00aa1227fb7422c19b70fd8d603447"}
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.547235 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.548831 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.551921 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.552171 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.557542 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.711351 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.711522 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.813532 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.813623 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.813695 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.839101 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:28:22 crc kubenswrapper[4849]: I0320 13:28:22.896461 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:28:24 crc kubenswrapper[4849]: I0320 13:28:24.445537 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b58b86f48-h8hmj"]
Mar 20 13:28:24 crc kubenswrapper[4849]: I0320 13:28:24.549095 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"]
Mar 20 13:28:24 crc kubenswrapper[4849]: I0320 13:28:24.820609 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfr7p"]
Mar 20 13:28:25 crc kubenswrapper[4849]: E0320 13:28:25.775296 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 13:28:25 crc kubenswrapper[4849]: E0320 13:28:25.776032 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qdlp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-xx4fv_openshift-marketplace(63553d28-5dba-492e-b004-043ea30ee635): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 13:28:25 crc kubenswrapper[4849]: E0320 13:28:25.777603 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-xx4fv" podUID="63553d28-5dba-492e-b004-043ea30ee635"
Mar 20 13:28:25 crc kubenswrapper[4849]: E0320 13:28:25.822045 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 13:28:25 crc kubenswrapper[4849]: E0320 13:28:25.822296 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xf4g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5zspj_openshift-marketplace(5e607d4a-4c18-4de3-9b29-c5f32fadee50): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 13:28:25 crc kubenswrapper[4849]: E0320 13:28:25.823800 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5zspj" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50"
Mar 20 13:28:26 crc kubenswrapper[4849]: I0320 13:28:26.570473 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body=
Mar 20 13:28:26 crc kubenswrapper[4849]: I0320 13:28:26.570544 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused"
Mar 20 13:28:26 crc kubenswrapper[4849]: I0320 13:28:26.954482 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 13:28:26 crc kubenswrapper[4849]: I0320 13:28:26.955691 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:26 crc kubenswrapper[4849]: I0320 13:28:26.958725 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 13:28:26 crc kubenswrapper[4849]: I0320 13:28:26.994353 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-var-lock\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:26 crc kubenswrapper[4849]: I0320 13:28:26.994505 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:26 crc kubenswrapper[4849]: I0320 13:28:26.994592 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kube-api-access\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:27 crc kubenswrapper[4849]: I0320 13:28:27.095286 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kube-api-access\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:27 crc kubenswrapper[4849]: I0320 13:28:27.095354 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-var-lock\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:27 crc kubenswrapper[4849]: I0320 13:28:27.095411 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:27 crc kubenswrapper[4849]: I0320 13:28:27.095413 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-var-lock\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:27 crc kubenswrapper[4849]: I0320 13:28:27.095464 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:27 crc kubenswrapper[4849]: I0320 13:28:27.115531 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kube-api-access\") pod \"installer-9-crc\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:27 crc kubenswrapper[4849]: I0320 13:28:27.285007 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:28:30 crc kubenswrapper[4849]: E0320 13:28:30.662166 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-xx4fv" podUID="63553d28-5dba-492e-b004-043ea30ee635"
Mar 20 13:28:30 crc kubenswrapper[4849]: E0320 13:28:30.662220 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5zspj" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50"
Mar 20 13:28:30 crc kubenswrapper[4849]: E0320 13:28:30.750916 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Mar 20 13:28:30 crc kubenswrapper[4849]: E0320 13:28:30.751102 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4ndjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-v2fkf_openshift-marketplace(ae23d7db-4e32-4c07-ae0a-19dd8ac82a10): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 13:28:30 crc kubenswrapper[4849]: E0320 13:28:30.752303 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-v2fkf" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10"
Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.197241 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-v2fkf" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10"
Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.289303 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.289483 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nbb4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6zwxp_openshift-marketplace(ee7ffb06-f91c-4469-9c5d-ee0a4296c805): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.290637 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6zwxp" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" Mar 20 13:28:32 crc 
kubenswrapper[4849]: E0320 13:28:32.314564 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.314861 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xm282,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-lnk65_openshift-marketplace(02c87e15-4f0c-422f-812b-5a4bcbf1b639): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.316043 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lnk65" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.320568 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.320665 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9vvpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dgk97_openshift-marketplace(b7e8bcae-39ef-4786-b2b8-18dea74380fa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:28:32 crc kubenswrapper[4849]: E0320 13:28:32.322848 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dgk97" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" Mar 20 13:28:32 crc 
kubenswrapper[4849]: I0320 13:28:32.475925 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b58b86f48-h8hmj"] Mar 20 13:28:32 crc kubenswrapper[4849]: I0320 13:28:32.636806 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"] Mar 20 13:28:32 crc kubenswrapper[4849]: I0320 13:28:32.747554 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:28:32 crc kubenswrapper[4849]: I0320 13:28:32.853586 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.086322 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" event={"ID":"6b97f0a6-160b-4655-b3da-005f1b66e7ef","Type":"ContainerStarted","Data":"278a8ef39b5808b87483d55481f2222b3fe11dd07ddef4ae4bcf4e3c220ba755"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.086372 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" event={"ID":"6b97f0a6-160b-4655-b3da-005f1b66e7ef","Type":"ContainerStarted","Data":"f3d291bf7d722e3de49ea41c4c6914ee8c69a729a1ae7d77f07ac2ea5943eb5b"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.086498 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" podUID="6b97f0a6-160b-4655-b3da-005f1b66e7ef" containerName="controller-manager" containerID="cri-o://278a8ef39b5808b87483d55481f2222b3fe11dd07ddef4ae4bcf4e3c220ba755" gracePeriod=30 Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.087286 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 
13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.111589 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" podStartSLOduration=29.111556409 podStartE2EDuration="29.111556409s" podCreationTimestamp="2026-03-20 13:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:33.108178936 +0000 UTC m=+262.785902341" watchObservedRunningTime="2026-03-20 13:28:33.111556409 +0000 UTC m=+262.789279804" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.116145 4849 generic.go:334] "Generic (PLEG): container finished" podID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerID="809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2" exitCode=0 Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.116256 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft4dw" event={"ID":"b7396166-d1a2-4565-8ccc-3ed06ce215f4","Type":"ContainerDied","Data":"809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.129289 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.129422 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf9r2" event={"ID":"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e","Type":"ContainerStarted","Data":"975f4e3ad717bd86822dc1225bf93aeea634d7107c6aaffaecdf2689d11fa64c"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.136757 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" 
event={"ID":"2fe5d2cf-cadf-48ce-9bed-549e618bec5b","Type":"ContainerStarted","Data":"865b9d7f8d7a637c967f1e55fdd353246d99700554b22a6c0972bccdfbd6ee13"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.136800 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" event={"ID":"2fe5d2cf-cadf-48ce-9bed-549e618bec5b","Type":"ContainerStarted","Data":"128342cd034008ebb189b72f2aaaa23b4595ef7dcbd6f0c4ccc0a1289d1b7ea3"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.136943 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" podUID="2fe5d2cf-cadf-48ce-9bed-549e618bec5b" containerName="route-controller-manager" containerID="cri-o://865b9d7f8d7a637c967f1e55fdd353246d99700554b22a6c0972bccdfbd6ee13" gracePeriod=30 Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.137353 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.163431 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f","Type":"ContainerStarted","Data":"cf7818c813d28aebb56832dc38c78b783074c33d88d2e6880381c33517cb3c6f"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.166779 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e5ebd6cd-36b8-4c55-a33f-442885c800c3","Type":"ContainerStarted","Data":"0a4bf5a304d9ac624cf7eb5afefcd5235f5ebc5ca8f684e87b72a73c39c6e0e8"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.171861 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" 
event={"ID":"4855b8cf-a062-487c-bf23-49fd7f919e7a","Type":"ContainerStarted","Data":"d5e5dc27ea89bc2e39a9b38a4fe842de9ec1359c142d55a265b10eca6c9dc696"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.185712 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mwgqm" event={"ID":"e0a6353b-f7df-4ef2-b5c0-e52f35646aba","Type":"ContainerStarted","Data":"9e4e9884f8597059fec26a50324f032e429962a9c44a1e2338b4227570a46b8c"} Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.185796 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.188050 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.188108 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.188347 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" event={"ID":"5b89502e-d430-490b-83df-7e4ba6393a51","Type":"ContainerStarted","Data":"15d66e4d503a600034c2181c1e5e6d59b92ac9ebd8ca07e95f2c186ab36b47b1"} Mar 20 13:28:33 crc kubenswrapper[4849]: E0320 13:28:33.190244 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-lnk65" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" Mar 20 13:28:33 crc kubenswrapper[4849]: E0320 13:28:33.191427 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dgk97" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" Mar 20 13:28:33 crc kubenswrapper[4849]: E0320 13:28:33.191545 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6zwxp" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.209845 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" podStartSLOduration=29.209789925 podStartE2EDuration="29.209789925s" podCreationTimestamp="2026-03-20 13:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:33.196885359 +0000 UTC m=+262.874608774" watchObservedRunningTime="2026-03-20 13:28:33.209789925 +0000 UTC m=+262.887513320" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.256245 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" podStartSLOduration=21.076300405 podStartE2EDuration="33.256225994s" podCreationTimestamp="2026-03-20 13:28:00 +0000 UTC" firstStartedPulling="2026-03-20 13:28:20.197838313 +0000 UTC m=+249.875561708" lastFinishedPulling="2026-03-20 13:28:32.377763902 +0000 UTC m=+262.055487297" observedRunningTime="2026-03-20 
13:28:33.253607902 +0000 UTC m=+262.931331287" watchObservedRunningTime="2026-03-20 13:28:33.256225994 +0000 UTC m=+262.933949389" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.294009 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" podStartSLOduration=100.95572126 podStartE2EDuration="2m33.293986664s" podCreationTimestamp="2026-03-20 13:26:00 +0000 UTC" firstStartedPulling="2026-03-20 13:27:40.060935109 +0000 UTC m=+209.738658504" lastFinishedPulling="2026-03-20 13:28:32.399200513 +0000 UTC m=+262.076923908" observedRunningTime="2026-03-20 13:28:33.288568175 +0000 UTC m=+262.966291570" watchObservedRunningTime="2026-03-20 13:28:33.293986664 +0000 UTC m=+262.971710059" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.412362 4849 patch_prober.go:28] interesting pod/route-controller-manager-6444f96cbd-2qhtf container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:40864->10.217.0.61:8443: read: connection reset by peer" start-of-body= Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.412434 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" podUID="2fe5d2cf-cadf-48ce-9bed-549e618bec5b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:40864->10.217.0.61:8443: read: connection reset by peer" Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.620992 4849 csr.go:261] certificate signing request csr-6cqrn is approved, waiting to be issued Mar 20 13:28:33 crc kubenswrapper[4849]: I0320 13:28:33.630911 4849 csr.go:257] certificate signing request csr-6cqrn is issued Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.212103 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6444f96cbd-2qhtf_2fe5d2cf-cadf-48ce-9bed-549e618bec5b/route-controller-manager/0.log" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.212365 4849 generic.go:334] "Generic (PLEG): container finished" podID="2fe5d2cf-cadf-48ce-9bed-549e618bec5b" containerID="865b9d7f8d7a637c967f1e55fdd353246d99700554b22a6c0972bccdfbd6ee13" exitCode=255 Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.212419 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" event={"ID":"2fe5d2cf-cadf-48ce-9bed-549e618bec5b","Type":"ContainerDied","Data":"865b9d7f8d7a637c967f1e55fdd353246d99700554b22a6c0972bccdfbd6ee13"} Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.213870 4849 generic.go:334] "Generic (PLEG): container finished" podID="4855b8cf-a062-487c-bf23-49fd7f919e7a" containerID="d5e5dc27ea89bc2e39a9b38a4fe842de9ec1359c142d55a265b10eca6c9dc696" exitCode=0 Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.213947 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" event={"ID":"4855b8cf-a062-487c-bf23-49fd7f919e7a","Type":"ContainerDied","Data":"d5e5dc27ea89bc2e39a9b38a4fe842de9ec1359c142d55a265b10eca6c9dc696"} Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.215281 4849 generic.go:334] "Generic (PLEG): container finished" podID="5b89502e-d430-490b-83df-7e4ba6393a51" containerID="15d66e4d503a600034c2181c1e5e6d59b92ac9ebd8ca07e95f2c186ab36b47b1" exitCode=0 Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.215328 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" event={"ID":"5b89502e-d430-490b-83df-7e4ba6393a51","Type":"ContainerDied","Data":"15d66e4d503a600034c2181c1e5e6d59b92ac9ebd8ca07e95f2c186ab36b47b1"} Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.217273 4849 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f","Type":"ContainerStarted","Data":"fbfd54f0eac0824da61fd35f955fe56e4b67daf5f785ef195cceaf24882710f3"} Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.220669 4849 generic.go:334] "Generic (PLEG): container finished" podID="6b97f0a6-160b-4655-b3da-005f1b66e7ef" containerID="278a8ef39b5808b87483d55481f2222b3fe11dd07ddef4ae4bcf4e3c220ba755" exitCode=0 Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.220718 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" event={"ID":"6b97f0a6-160b-4655-b3da-005f1b66e7ef","Type":"ContainerDied","Data":"278a8ef39b5808b87483d55481f2222b3fe11dd07ddef4ae4bcf4e3c220ba755"} Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.233086 4849 generic.go:334] "Generic (PLEG): container finished" podID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerID="975f4e3ad717bd86822dc1225bf93aeea634d7107c6aaffaecdf2689d11fa64c" exitCode=0 Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.233188 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf9r2" event={"ID":"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e","Type":"ContainerDied","Data":"975f4e3ad717bd86822dc1225bf93aeea634d7107c6aaffaecdf2689d11fa64c"} Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.246763 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.246837 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.246885 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e5ebd6cd-36b8-4c55-a33f-442885c800c3","Type":"ContainerStarted","Data":"c8532dabc515123c220d97782867f7f0fac3616731f86a008e6a9add465f27f3"} Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.284477 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=12.284457193 podStartE2EDuration="12.284457193s" podCreationTimestamp="2026-03-20 13:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:34.259972179 +0000 UTC m=+263.937695574" watchObservedRunningTime="2026-03-20 13:28:34.284457193 +0000 UTC m=+263.962180588" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.287087 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.287077495 podStartE2EDuration="8.287077495s" podCreationTimestamp="2026-03-20 13:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:34.286666914 +0000 UTC m=+263.964390309" watchObservedRunningTime="2026-03-20 13:28:34.287077495 +0000 UTC m=+263.964800890" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.437097 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.463905 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m"] Mar 20 13:28:34 crc kubenswrapper[4849]: E0320 13:28:34.464202 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b97f0a6-160b-4655-b3da-005f1b66e7ef" containerName="controller-manager" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.464218 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b97f0a6-160b-4655-b3da-005f1b66e7ef" containerName="controller-manager" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.464344 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b97f0a6-160b-4655-b3da-005f1b66e7ef" containerName="controller-manager" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.465408 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.497211 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m"] Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.538651 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6444f96cbd-2qhtf_2fe5d2cf-cadf-48ce-9bed-549e618bec5b/route-controller-manager/0.log" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.538722 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.605148 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-client-ca\") pod \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.605249 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-config\") pod \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.606033 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-client-ca" (OuterVolumeSpecName: "client-ca") pod "2fe5d2cf-cadf-48ce-9bed-549e618bec5b" (UID: "2fe5d2cf-cadf-48ce-9bed-549e618bec5b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.606271 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng5qm\" (UniqueName: \"kubernetes.io/projected/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-kube-api-access-ng5qm\") pod \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.606326 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvqp2\" (UniqueName: \"kubernetes.io/projected/6b97f0a6-160b-4655-b3da-005f1b66e7ef-kube-api-access-rvqp2\") pod \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.606349 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-config\") pod \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.606397 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-proxy-ca-bundles\") pod \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.606437 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-serving-cert\") pod \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\" (UID: \"2fe5d2cf-cadf-48ce-9bed-549e618bec5b\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.606470 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6b97f0a6-160b-4655-b3da-005f1b66e7ef-serving-cert\") pod \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.606487 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-client-ca\") pod \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\" (UID: \"6b97f0a6-160b-4655-b3da-005f1b66e7ef\") " Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.607418 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-client-ca" (OuterVolumeSpecName: "client-ca") pod "6b97f0a6-160b-4655-b3da-005f1b66e7ef" (UID: "6b97f0a6-160b-4655-b3da-005f1b66e7ef"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.607666 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-proxy-ca-bundles\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.607460 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-config" (OuterVolumeSpecName: "config") pod "6b97f0a6-160b-4655-b3da-005f1b66e7ef" (UID: "6b97f0a6-160b-4655-b3da-005f1b66e7ef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.607494 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6b97f0a6-160b-4655-b3da-005f1b66e7ef" (UID: "6b97f0a6-160b-4655-b3da-005f1b66e7ef"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.607507 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-config" (OuterVolumeSpecName: "config") pod "2fe5d2cf-cadf-48ce-9bed-549e618bec5b" (UID: "2fe5d2cf-cadf-48ce-9bed-549e618bec5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608412 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99301b48-4cf1-41aa-bbb8-46166e443369-serving-cert\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608598 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-config\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608670 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-client-ca\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608701 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvvl\" (UniqueName: \"kubernetes.io/projected/99301b48-4cf1-41aa-bbb8-46166e443369-kube-api-access-mvvvl\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608890 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608913 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608922 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608931 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.608940 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b97f0a6-160b-4655-b3da-005f1b66e7ef-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 
13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.619055 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-kube-api-access-ng5qm" (OuterVolumeSpecName: "kube-api-access-ng5qm") pod "2fe5d2cf-cadf-48ce-9bed-549e618bec5b" (UID: "2fe5d2cf-cadf-48ce-9bed-549e618bec5b"). InnerVolumeSpecName "kube-api-access-ng5qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.619077 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2fe5d2cf-cadf-48ce-9bed-549e618bec5b" (UID: "2fe5d2cf-cadf-48ce-9bed-549e618bec5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.619267 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b97f0a6-160b-4655-b3da-005f1b66e7ef-kube-api-access-rvqp2" (OuterVolumeSpecName: "kube-api-access-rvqp2") pod "6b97f0a6-160b-4655-b3da-005f1b66e7ef" (UID: "6b97f0a6-160b-4655-b3da-005f1b66e7ef"). InnerVolumeSpecName "kube-api-access-rvqp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.619353 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b97f0a6-160b-4655-b3da-005f1b66e7ef-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6b97f0a6-160b-4655-b3da-005f1b66e7ef" (UID: "6b97f0a6-160b-4655-b3da-005f1b66e7ef"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.632309 4849 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-20 02:10:56.031102285 +0000 UTC Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.632338 4849 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7332h42m21.398767707s for next certificate rotation Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709341 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-proxy-ca-bundles\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709394 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99301b48-4cf1-41aa-bbb8-46166e443369-serving-cert\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709428 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-config\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709457 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-client-ca\") pod 
\"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709472 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvvl\" (UniqueName: \"kubernetes.io/projected/99301b48-4cf1-41aa-bbb8-46166e443369-kube-api-access-mvvvl\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709526 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709535 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b97f0a6-160b-4655-b3da-005f1b66e7ef-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709545 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng5qm\" (UniqueName: \"kubernetes.io/projected/2fe5d2cf-cadf-48ce-9bed-549e618bec5b-kube-api-access-ng5qm\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.709554 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvqp2\" (UniqueName: \"kubernetes.io/projected/6b97f0a6-160b-4655-b3da-005f1b66e7ef-kube-api-access-rvqp2\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.711386 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-client-ca\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: 
\"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.711661 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-proxy-ca-bundles\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.711747 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-config\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.712745 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99301b48-4cf1-41aa-bbb8-46166e443369-serving-cert\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.724250 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvvl\" (UniqueName: \"kubernetes.io/projected/99301b48-4cf1-41aa-bbb8-46166e443369-kube-api-access-mvvvl\") pod \"controller-manager-5bf7f8c6cf-sc97m\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:34 crc kubenswrapper[4849]: I0320 13:28:34.779889 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.198797 4849 patch_prober.go:28] interesting pod/controller-manager-7b58b86f48-h8hmj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.199284 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" podUID="6b97f0a6-160b-4655-b3da-005f1b66e7ef" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.253749 4849 generic.go:334] "Generic (PLEG): container finished" podID="bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f" containerID="fbfd54f0eac0824da61fd35f955fe56e4b67daf5f785ef195cceaf24882710f3" exitCode=0 Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.253796 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f","Type":"ContainerDied","Data":"fbfd54f0eac0824da61fd35f955fe56e4b67daf5f785ef195cceaf24882710f3"} Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.255680 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" event={"ID":"6b97f0a6-160b-4655-b3da-005f1b66e7ef","Type":"ContainerDied","Data":"f3d291bf7d722e3de49ea41c4c6914ee8c69a729a1ae7d77f07ac2ea5943eb5b"} Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.255708 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b58b86f48-h8hmj" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.255717 4849 scope.go:117] "RemoveContainer" containerID="278a8ef39b5808b87483d55481f2222b3fe11dd07ddef4ae4bcf4e3c220ba755" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.258620 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6444f96cbd-2qhtf_2fe5d2cf-cadf-48ce-9bed-549e618bec5b/route-controller-manager/0.log" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.258723 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" event={"ID":"2fe5d2cf-cadf-48ce-9bed-549e618bec5b","Type":"ContainerDied","Data":"128342cd034008ebb189b72f2aaaa23b4595ef7dcbd6f0c4ccc0a1289d1b7ea3"} Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.258895 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.284232 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"] Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.288874 4849 scope.go:117] "RemoveContainer" containerID="865b9d7f8d7a637c967f1e55fdd353246d99700554b22a6c0972bccdfbd6ee13" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.291883 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444f96cbd-2qhtf"] Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.298036 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b58b86f48-h8hmj"] Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.304273 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b58b86f48-h8hmj"] Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.454423 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m"] Mar 20 13:28:35 crc kubenswrapper[4849]: W0320 13:28:35.482279 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99301b48_4cf1_41aa_bbb8_46166e443369.slice/crio-e36b53226ae0927ca1688f1bca8d527f55c87b9590f6afc62015a029b62b1428 WatchSource:0}: Error finding container e36b53226ae0927ca1688f1bca8d527f55c87b9590f6afc62015a029b62b1428: Status 404 returned error can't find the container with id e36b53226ae0927ca1688f1bca8d527f55c87b9590f6afc62015a029b62b1428 Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.632791 4849 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation 
deadline is 2026-12-23 08:56:11.270868902 +0000 UTC Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.633129 4849 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6667h27m35.637743111s for next certificate rotation Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.646987 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.685462 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.727504 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wmj4\" (UniqueName: \"kubernetes.io/projected/4855b8cf-a062-487c-bf23-49fd7f919e7a-kube-api-access-4wmj4\") pod \"4855b8cf-a062-487c-bf23-49fd7f919e7a\" (UID: \"4855b8cf-a062-487c-bf23-49fd7f919e7a\") " Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.727632 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8gvj\" (UniqueName: \"kubernetes.io/projected/5b89502e-d430-490b-83df-7e4ba6393a51-kube-api-access-f8gvj\") pod \"5b89502e-d430-490b-83df-7e4ba6393a51\" (UID: \"5b89502e-d430-490b-83df-7e4ba6393a51\") " Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.733606 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b89502e-d430-490b-83df-7e4ba6393a51-kube-api-access-f8gvj" (OuterVolumeSpecName: "kube-api-access-f8gvj") pod "5b89502e-d430-490b-83df-7e4ba6393a51" (UID: "5b89502e-d430-490b-83df-7e4ba6393a51"). InnerVolumeSpecName "kube-api-access-f8gvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.734019 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4855b8cf-a062-487c-bf23-49fd7f919e7a-kube-api-access-4wmj4" (OuterVolumeSpecName: "kube-api-access-4wmj4") pod "4855b8cf-a062-487c-bf23-49fd7f919e7a" (UID: "4855b8cf-a062-487c-bf23-49fd7f919e7a"). InnerVolumeSpecName "kube-api-access-4wmj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.829611 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8gvj\" (UniqueName: \"kubernetes.io/projected/5b89502e-d430-490b-83df-7e4ba6393a51-kube-api-access-f8gvj\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:35 crc kubenswrapper[4849]: I0320 13:28:35.829650 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wmj4\" (UniqueName: \"kubernetes.io/projected/4855b8cf-a062-487c-bf23-49fd7f919e7a-kube-api-access-4wmj4\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.267446 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" event={"ID":"99301b48-4cf1-41aa-bbb8-46166e443369","Type":"ContainerStarted","Data":"56a5e4f2eecb8bc33b9643e125028ea6b5b1874a4c9f763af511e38e45ed0278"} Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.267513 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" event={"ID":"99301b48-4cf1-41aa-bbb8-46166e443369","Type":"ContainerStarted","Data":"e36b53226ae0927ca1688f1bca8d527f55c87b9590f6afc62015a029b62b1428"} Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.267928 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:36 crc 
kubenswrapper[4849]: I0320 13:28:36.270112 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf9r2" event={"ID":"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e","Type":"ContainerStarted","Data":"1072baad1d600c9f917f9b2cbd8bd6691f8e4df5abd7eb57637b0ab94b61b9f3"} Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.281604 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.286497 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.286502 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-7cjjt" event={"ID":"4855b8cf-a062-487c-bf23-49fd7f919e7a","Type":"ContainerDied","Data":"75dc0126dd246a9f33b87fae6fcd9a3ccf970b4e0d669cd5d2d9794f1712f4ec"} Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.286783 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75dc0126dd246a9f33b87fae6fcd9a3ccf970b4e0d669cd5d2d9794f1712f4ec" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.289036 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.289047 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-9ps6c" event={"ID":"5b89502e-d430-490b-83df-7e4ba6393a51","Type":"ContainerDied","Data":"3c7e1e1d02137f44217f310530380f9b7d00aa1227fb7422c19b70fd8d603447"} Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.289094 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c7e1e1d02137f44217f310530380f9b7d00aa1227fb7422c19b70fd8d603447" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.328467 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qf9r2" podStartSLOduration=4.576426391 podStartE2EDuration="51.328438008s" podCreationTimestamp="2026-03-20 13:27:45 +0000 UTC" firstStartedPulling="2026-03-20 13:27:48.276138876 +0000 UTC m=+217.953862271" lastFinishedPulling="2026-03-20 13:28:35.028150493 +0000 UTC m=+264.705873888" observedRunningTime="2026-03-20 13:28:36.325846966 +0000 UTC m=+266.003570361" watchObservedRunningTime="2026-03-20 13:28:36.328438008 +0000 UTC m=+266.006161403" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.328619 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" podStartSLOduration=12.328614523 podStartE2EDuration="12.328614523s" podCreationTimestamp="2026-03-20 13:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:36.300100387 +0000 UTC m=+265.977823782" watchObservedRunningTime="2026-03-20 13:28:36.328614523 +0000 UTC m=+266.006337918" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.391552 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.391603 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.570709 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.571112 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.570714 4849 patch_prober.go:28] interesting pod/downloads-7954f5f757-mwgqm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.571368 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mwgqm" podUID="e0a6353b-f7df-4ef2-b5c0-e52f35646aba" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.638959 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.739074 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kubelet-dir\") pod \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\" (UID: \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\") " Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.739308 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f" (UID: "bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.840002 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kube-api-access\") pod \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\" (UID: \"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f\") " Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.840213 4849 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.847290 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f" (UID: "bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:36 crc kubenswrapper[4849]: I0320 13:28:36.940919 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.055604 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe5d2cf-cadf-48ce-9bed-549e618bec5b" path="/var/lib/kubelet/pods/2fe5d2cf-cadf-48ce-9bed-549e618bec5b/volumes" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.056332 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b97f0a6-160b-4655-b3da-005f1b66e7ef" path="/var/lib/kubelet/pods/6b97f0a6-160b-4655-b3da-005f1b66e7ef/volumes" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.308805 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp"] Mar 20 13:28:37 crc kubenswrapper[4849]: E0320 13:28:37.309049 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b89502e-d430-490b-83df-7e4ba6393a51" containerName="oc" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.309061 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b89502e-d430-490b-83df-7e4ba6393a51" containerName="oc" Mar 20 13:28:37 crc kubenswrapper[4849]: E0320 13:28:37.309076 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4855b8cf-a062-487c-bf23-49fd7f919e7a" containerName="oc" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.309083 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="4855b8cf-a062-487c-bf23-49fd7f919e7a" containerName="oc" Mar 20 13:28:37 crc kubenswrapper[4849]: E0320 13:28:37.309093 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f" containerName="pruner" Mar 20 13:28:37 crc 
kubenswrapper[4849]: I0320 13:28:37.309099 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f" containerName="pruner" Mar 20 13:28:37 crc kubenswrapper[4849]: E0320 13:28:37.309110 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fe5d2cf-cadf-48ce-9bed-549e618bec5b" containerName="route-controller-manager" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.309118 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe5d2cf-cadf-48ce-9bed-549e618bec5b" containerName="route-controller-manager" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.309216 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fe5d2cf-cadf-48ce-9bed-549e618bec5b" containerName="route-controller-manager" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.309226 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b89502e-d430-490b-83df-7e4ba6393a51" containerName="oc" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.309235 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f" containerName="pruner" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.309246 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="4855b8cf-a062-487c-bf23-49fd7f919e7a" containerName="oc" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.309616 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.313416 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.313772 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.313960 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.314276 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.314451 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.314611 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.318880 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.319400 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bb2dd0a4-bdf6-4a9c-8913-860f773c7c6f","Type":"ContainerDied","Data":"cf7818c813d28aebb56832dc38c78b783074c33d88d2e6880381c33517cb3c6f"} Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.319563 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf7818c813d28aebb56832dc38c78b783074c33d88d2e6880381c33517cb3c6f" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.325312 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp"] Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.327721 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft4dw" event={"ID":"b7396166-d1a2-4565-8ccc-3ed06ce215f4","Type":"ContainerStarted","Data":"64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c"} Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.367074 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ft4dw" podStartSLOduration=4.540886101 podStartE2EDuration="52.367055283s" podCreationTimestamp="2026-03-20 13:27:45 +0000 UTC" firstStartedPulling="2026-03-20 13:27:48.331904132 +0000 UTC m=+218.009627527" lastFinishedPulling="2026-03-20 13:28:36.158073314 +0000 UTC m=+265.835796709" observedRunningTime="2026-03-20 13:28:37.364551054 +0000 UTC m=+267.042274449" watchObservedRunningTime="2026-03-20 13:28:37.367055283 +0000 UTC m=+267.044778678" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.446446 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-config\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.446749 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-client-ca\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.446830 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/949db4a7-19da-45b5-bfb8-1980f5ab99ee-serving-cert\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.446888 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6nws\" (UniqueName: \"kubernetes.io/projected/949db4a7-19da-45b5-bfb8-1980f5ab99ee-kube-api-access-g6nws\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.547664 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6nws\" (UniqueName: \"kubernetes.io/projected/949db4a7-19da-45b5-bfb8-1980f5ab99ee-kube-api-access-g6nws\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") 
" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.548130 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-config\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.548151 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-client-ca\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.549343 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-client-ca\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.549452 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-config\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.549908 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/949db4a7-19da-45b5-bfb8-1980f5ab99ee-serving-cert\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.553859 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/949db4a7-19da-45b5-bfb8-1980f5ab99ee-serving-cert\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.566463 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6nws\" (UniqueName: \"kubernetes.io/projected/949db4a7-19da-45b5-bfb8-1980f5ab99ee-kube-api-access-g6nws\") pod \"route-controller-manager-654f7994fb-962rp\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:37 crc kubenswrapper[4849]: I0320 13:28:37.637306 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:38 crc kubenswrapper[4849]: I0320 13:28:38.021455 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qf9r2" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="registry-server" probeResult="failure" output=< Mar 20 13:28:38 crc kubenswrapper[4849]: timeout: failed to connect service ":50051" within 1s Mar 20 13:28:38 crc kubenswrapper[4849]: > Mar 20 13:28:38 crc kubenswrapper[4849]: I0320 13:28:38.227049 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp"] Mar 20 13:28:38 crc kubenswrapper[4849]: W0320 13:28:38.231470 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod949db4a7_19da_45b5_bfb8_1980f5ab99ee.slice/crio-fa1f1f44506a0fd0e15c40e48cbde553ba35bf9aa33e5f10b81c26aff311dc30 WatchSource:0}: Error finding container fa1f1f44506a0fd0e15c40e48cbde553ba35bf9aa33e5f10b81c26aff311dc30: Status 404 returned error can't find the container with id fa1f1f44506a0fd0e15c40e48cbde553ba35bf9aa33e5f10b81c26aff311dc30 Mar 20 13:28:38 crc kubenswrapper[4849]: I0320 13:28:38.335595 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" event={"ID":"949db4a7-19da-45b5-bfb8-1980f5ab99ee","Type":"ContainerStarted","Data":"fa1f1f44506a0fd0e15c40e48cbde553ba35bf9aa33e5f10b81c26aff311dc30"} Mar 20 13:28:39 crc kubenswrapper[4849]: I0320 13:28:39.343710 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" event={"ID":"949db4a7-19da-45b5-bfb8-1980f5ab99ee","Type":"ContainerStarted","Data":"9b8c184f4629c9f45c066cd025447df7535737f77d8f03ee3e9dcb302d973c5f"} Mar 20 13:28:39 crc 
kubenswrapper[4849]: I0320 13:28:39.344349 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:39 crc kubenswrapper[4849]: I0320 13:28:39.351910 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:39 crc kubenswrapper[4849]: I0320 13:28:39.361623 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" podStartSLOduration=15.361609626 podStartE2EDuration="15.361609626s" podCreationTimestamp="2026-03-20 13:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:39.360455934 +0000 UTC m=+269.038179339" watchObservedRunningTime="2026-03-20 13:28:39.361609626 +0000 UTC m=+269.039333021" Mar 20 13:28:39 crc kubenswrapper[4849]: I0320 13:28:39.384957 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:28:39 crc kubenswrapper[4849]: I0320 13:28:39.385012 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:28:39 crc kubenswrapper[4849]: I0320 13:28:39.385052 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:28:39 crc 
kubenswrapper[4849]: I0320 13:28:39.385597 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa"} pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:28:39 crc kubenswrapper[4849]: I0320 13:28:39.385645 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" containerID="cri-o://25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa" gracePeriod=600 Mar 20 13:28:40 crc kubenswrapper[4849]: I0320 13:28:40.352773 4849 generic.go:334] "Generic (PLEG): container finished" podID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerID="25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa" exitCode=0 Mar 20 13:28:40 crc kubenswrapper[4849]: I0320 13:28:40.352845 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerDied","Data":"25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa"} Mar 20 13:28:41 crc kubenswrapper[4849]: I0320 13:28:41.361879 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"f14fd97e3ce2e8670a2db11e0c02e2fc9f8ae8117a289b185f11ac430a24ae2d"} Mar 20 13:28:44 crc kubenswrapper[4849]: I0320 13:28:44.431973 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m"] Mar 20 13:28:44 crc kubenswrapper[4849]: I0320 13:28:44.432886 4849 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" podUID="99301b48-4cf1-41aa-bbb8-46166e443369" containerName="controller-manager" containerID="cri-o://56a5e4f2eecb8bc33b9643e125028ea6b5b1874a4c9f763af511e38e45ed0278" gracePeriod=30 Mar 20 13:28:44 crc kubenswrapper[4849]: I0320 13:28:44.467033 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp"] Mar 20 13:28:44 crc kubenswrapper[4849]: I0320 13:28:44.467230 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" podUID="949db4a7-19da-45b5-bfb8-1980f5ab99ee" containerName="route-controller-manager" containerID="cri-o://9b8c184f4629c9f45c066cd025447df7535737f77d8f03ee3e9dcb302d973c5f" gracePeriod=30 Mar 20 13:28:44 crc kubenswrapper[4849]: I0320 13:28:44.781336 4849 patch_prober.go:28] interesting pod/controller-manager-5bf7f8c6cf-sc97m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Mar 20 13:28:44 crc kubenswrapper[4849]: I0320 13:28:44.781404 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" podUID="99301b48-4cf1-41aa-bbb8-46166e443369" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Mar 20 13:28:45 crc kubenswrapper[4849]: I0320 13:28:45.392060 4849 generic.go:334] "Generic (PLEG): container finished" podID="99301b48-4cf1-41aa-bbb8-46166e443369" containerID="56a5e4f2eecb8bc33b9643e125028ea6b5b1874a4c9f763af511e38e45ed0278" exitCode=0 Mar 20 13:28:45 crc kubenswrapper[4849]: I0320 
13:28:45.392149 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" event={"ID":"99301b48-4cf1-41aa-bbb8-46166e443369","Type":"ContainerDied","Data":"56a5e4f2eecb8bc33b9643e125028ea6b5b1874a4c9f763af511e38e45ed0278"} Mar 20 13:28:45 crc kubenswrapper[4849]: I0320 13:28:45.393981 4849 generic.go:334] "Generic (PLEG): container finished" podID="949db4a7-19da-45b5-bfb8-1980f5ab99ee" containerID="9b8c184f4629c9f45c066cd025447df7535737f77d8f03ee3e9dcb302d973c5f" exitCode=0 Mar 20 13:28:45 crc kubenswrapper[4849]: I0320 13:28:45.394012 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" event={"ID":"949db4a7-19da-45b5-bfb8-1980f5ab99ee","Type":"ContainerDied","Data":"9b8c184f4629c9f45c066cd025447df7535737f77d8f03ee3e9dcb302d973c5f"} Mar 20 13:28:45 crc kubenswrapper[4849]: I0320 13:28:45.899365 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:28:45 crc kubenswrapper[4849]: I0320 13:28:45.899463 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:28:45 crc kubenswrapper[4849]: I0320 13:28:45.964155 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:28:46 crc kubenswrapper[4849]: I0320 13:28:46.460870 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:28:46 crc kubenswrapper[4849]: I0320 13:28:46.461571 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:28:46 crc kubenswrapper[4849]: I0320 13:28:46.509977 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:28:46 crc kubenswrapper[4849]: I0320 13:28:46.576335 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mwgqm" Mar 20 13:28:47 crc kubenswrapper[4849]: I0320 13:28:47.638134 4849 patch_prober.go:28] interesting pod/route-controller-manager-654f7994fb-962rp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Mar 20 13:28:47 crc kubenswrapper[4849]: I0320 13:28:47.638504 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" podUID="949db4a7-19da-45b5-bfb8-1980f5ab99ee" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 20 13:28:47 crc kubenswrapper[4849]: I0320 13:28:47.663572 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qf9r2"] Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.268747 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.294775 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv"] Mar 20 13:28:48 crc kubenswrapper[4849]: E0320 13:28:48.295063 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949db4a7-19da-45b5-bfb8-1980f5ab99ee" containerName="route-controller-manager" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.295088 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="949db4a7-19da-45b5-bfb8-1980f5ab99ee" containerName="route-controller-manager" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.295222 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="949db4a7-19da-45b5-bfb8-1980f5ab99ee" containerName="route-controller-manager" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.298545 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.314350 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv"] Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.396449 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-config\") pod \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.396520 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/949db4a7-19da-45b5-bfb8-1980f5ab99ee-serving-cert\") pod \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.396570 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6nws\" (UniqueName: \"kubernetes.io/projected/949db4a7-19da-45b5-bfb8-1980f5ab99ee-kube-api-access-g6nws\") pod \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.396618 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-client-ca\") pod \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\" (UID: \"949db4a7-19da-45b5-bfb8-1980f5ab99ee\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.398182 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-config" (OuterVolumeSpecName: "config") pod "949db4a7-19da-45b5-bfb8-1980f5ab99ee" (UID: 
"949db4a7-19da-45b5-bfb8-1980f5ab99ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.398246 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-client-ca" (OuterVolumeSpecName: "client-ca") pod "949db4a7-19da-45b5-bfb8-1980f5ab99ee" (UID: "949db4a7-19da-45b5-bfb8-1980f5ab99ee"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.403987 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949db4a7-19da-45b5-bfb8-1980f5ab99ee-kube-api-access-g6nws" (OuterVolumeSpecName: "kube-api-access-g6nws") pod "949db4a7-19da-45b5-bfb8-1980f5ab99ee" (UID: "949db4a7-19da-45b5-bfb8-1980f5ab99ee"). InnerVolumeSpecName "kube-api-access-g6nws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.404048 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949db4a7-19da-45b5-bfb8-1980f5ab99ee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "949db4a7-19da-45b5-bfb8-1980f5ab99ee" (UID: "949db4a7-19da-45b5-bfb8-1980f5ab99ee"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.417943 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" event={"ID":"949db4a7-19da-45b5-bfb8-1980f5ab99ee","Type":"ContainerDied","Data":"fa1f1f44506a0fd0e15c40e48cbde553ba35bf9aa33e5f10b81c26aff311dc30"} Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.417967 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.418208 4849 scope.go:117] "RemoveContainer" containerID="9b8c184f4629c9f45c066cd025447df7535737f77d8f03ee3e9dcb302d973c5f" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.420815 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" event={"ID":"99301b48-4cf1-41aa-bbb8-46166e443369","Type":"ContainerDied","Data":"e36b53226ae0927ca1688f1bca8d527f55c87b9590f6afc62015a029b62b1428"} Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.420886 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e36b53226ae0927ca1688f1bca8d527f55c87b9590f6afc62015a029b62b1428" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.420963 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qf9r2" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="registry-server" containerID="cri-o://1072baad1d600c9f917f9b2cbd8bd6691f8e4df5abd7eb57637b0ab94b61b9f3" gracePeriod=2 Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.444672 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.454306 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp"] Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.457282 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-654f7994fb-962rp"] Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.498486 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-proxy-ca-bundles\") pod \"99301b48-4cf1-41aa-bbb8-46166e443369\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.498778 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvvvl\" (UniqueName: \"kubernetes.io/projected/99301b48-4cf1-41aa-bbb8-46166e443369-kube-api-access-mvvvl\") pod \"99301b48-4cf1-41aa-bbb8-46166e443369\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.498911 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99301b48-4cf1-41aa-bbb8-46166e443369-serving-cert\") pod \"99301b48-4cf1-41aa-bbb8-46166e443369\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.499118 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-config\") pod \"99301b48-4cf1-41aa-bbb8-46166e443369\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.499675 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-client-ca\") pod \"99301b48-4cf1-41aa-bbb8-46166e443369\" (UID: \"99301b48-4cf1-41aa-bbb8-46166e443369\") " Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.499922 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3119af-6692-427f-9114-85cb697af99e-serving-cert\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.500046 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xwrj\" (UniqueName: \"kubernetes.io/projected/9c3119af-6692-427f-9114-85cb697af99e-kube-api-access-7xwrj\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.500165 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-client-ca\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.501209 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-config\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: 
\"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.500992 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-client-ca" (OuterVolumeSpecName: "client-ca") pod "99301b48-4cf1-41aa-bbb8-46166e443369" (UID: "99301b48-4cf1-41aa-bbb8-46166e443369"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.501133 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "99301b48-4cf1-41aa-bbb8-46166e443369" (UID: "99301b48-4cf1-41aa-bbb8-46166e443369"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.501145 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-config" (OuterVolumeSpecName: "config") pod "99301b48-4cf1-41aa-bbb8-46166e443369" (UID: "99301b48-4cf1-41aa-bbb8-46166e443369"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.501646 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6nws\" (UniqueName: \"kubernetes.io/projected/949db4a7-19da-45b5-bfb8-1980f5ab99ee-kube-api-access-g6nws\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.501760 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.501989 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949db4a7-19da-45b5-bfb8-1980f5ab99ee-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.502090 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/949db4a7-19da-45b5-bfb8-1980f5ab99ee-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.504454 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99301b48-4cf1-41aa-bbb8-46166e443369-kube-api-access-mvvvl" (OuterVolumeSpecName: "kube-api-access-mvvvl") pod "99301b48-4cf1-41aa-bbb8-46166e443369" (UID: "99301b48-4cf1-41aa-bbb8-46166e443369"). InnerVolumeSpecName "kube-api-access-mvvvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.504978 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99301b48-4cf1-41aa-bbb8-46166e443369-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "99301b48-4cf1-41aa-bbb8-46166e443369" (UID: "99301b48-4cf1-41aa-bbb8-46166e443369"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603451 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3119af-6692-427f-9114-85cb697af99e-serving-cert\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603537 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xwrj\" (UniqueName: \"kubernetes.io/projected/9c3119af-6692-427f-9114-85cb697af99e-kube-api-access-7xwrj\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603572 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-client-ca\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603616 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-config\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603719 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/99301b48-4cf1-41aa-bbb8-46166e443369-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603738 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603749 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603759 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99301b48-4cf1-41aa-bbb8-46166e443369-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.603771 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvvvl\" (UniqueName: \"kubernetes.io/projected/99301b48-4cf1-41aa-bbb8-46166e443369-kube-api-access-mvvvl\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.604997 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-client-ca\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.605664 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-config\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.607966 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3119af-6692-427f-9114-85cb697af99e-serving-cert\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.622533 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xwrj\" (UniqueName: \"kubernetes.io/projected/9c3119af-6692-427f-9114-85cb697af99e-kube-api-access-7xwrj\") pod \"route-controller-manager-5b875597f4-428nv\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:48 crc kubenswrapper[4849]: I0320 13:28:48.920898 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.048225 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949db4a7-19da-45b5-bfb8-1980f5ab99ee" path="/var/lib/kubelet/pods/949db4a7-19da-45b5-bfb8-1980f5ab99ee/volumes" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.368117 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv"] Mar 20 13:28:49 crc kubenswrapper[4849]: W0320 13:28:49.396210 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c3119af_6692_427f_9114_85cb697af99e.slice/crio-0bb34c0e905019f147dc6f2313ecda868a0091000784c40aad84c663021632d3 WatchSource:0}: Error finding container 0bb34c0e905019f147dc6f2313ecda868a0091000784c40aad84c663021632d3: Status 404 returned error can't find the container with id 0bb34c0e905019f147dc6f2313ecda868a0091000784c40aad84c663021632d3 Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.427148 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx4fv" event={"ID":"63553d28-5dba-492e-b004-043ea30ee635","Type":"ContainerStarted","Data":"0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab"} Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.430270 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zspj" event={"ID":"5e607d4a-4c18-4de3-9b29-c5f32fadee50","Type":"ContainerStarted","Data":"561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193"} Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.440861 4849 generic.go:334] "Generic (PLEG): container finished" podID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerID="1072baad1d600c9f917f9b2cbd8bd6691f8e4df5abd7eb57637b0ab94b61b9f3" 
exitCode=0 Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.440925 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf9r2" event={"ID":"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e","Type":"ContainerDied","Data":"1072baad1d600c9f917f9b2cbd8bd6691f8e4df5abd7eb57637b0ab94b61b9f3"} Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.461232 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" event={"ID":"9c3119af-6692-427f-9114-85cb697af99e","Type":"ContainerStarted","Data":"0bb34c0e905019f147dc6f2313ecda868a0091000784c40aad84c663021632d3"} Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.463033 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.463560 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnk65" event={"ID":"02c87e15-4f0c-422f-812b-5a4bcbf1b639","Type":"ContainerStarted","Data":"59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8"} Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.495933 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m"] Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.500562 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bf7f8c6cf-sc97m"] Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.585147 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.624572 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fvxk\" (UniqueName: \"kubernetes.io/projected/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-kube-api-access-9fvxk\") pod \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.624641 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-catalog-content\") pod \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.624717 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-utilities\") pod \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\" (UID: \"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e\") " Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.625913 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-utilities" (OuterVolumeSpecName: "utilities") pod "5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" (UID: "5334df4b-9d2b-41c6-a18d-07c1c4edfd4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.651662 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-kube-api-access-9fvxk" (OuterVolumeSpecName: "kube-api-access-9fvxk") pod "5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" (UID: "5334df4b-9d2b-41c6-a18d-07c1c4edfd4e"). InnerVolumeSpecName "kube-api-access-9fvxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.708689 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" (UID: "5334df4b-9d2b-41c6-a18d-07c1c4edfd4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.726514 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fvxk\" (UniqueName: \"kubernetes.io/projected/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-kube-api-access-9fvxk\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.726546 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.726556 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:49 crc kubenswrapper[4849]: I0320 13:28:49.881338 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" podUID="f1f2af94-ce72-498b-a231-d171ab0e8760" containerName="oauth-openshift" containerID="cri-o://4026a7972ffb2d250f7fe26dd8f15e0561ad74b2bbd6b02d7f983d7364910223" gracePeriod=15 Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.314858 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bd5d45fc8-5smfc"] Mar 20 13:28:50 crc kubenswrapper[4849]: E0320 13:28:50.315688 4849 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="extract-utilities" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.315706 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="extract-utilities" Mar 20 13:28:50 crc kubenswrapper[4849]: E0320 13:28:50.315719 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99301b48-4cf1-41aa-bbb8-46166e443369" containerName="controller-manager" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.315725 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="99301b48-4cf1-41aa-bbb8-46166e443369" containerName="controller-manager" Mar 20 13:28:50 crc kubenswrapper[4849]: E0320 13:28:50.315740 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="extract-content" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.315746 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="extract-content" Mar 20 13:28:50 crc kubenswrapper[4849]: E0320 13:28:50.315761 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="registry-server" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.315768 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="registry-server" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.315903 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" containerName="registry-server" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.315926 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="99301b48-4cf1-41aa-bbb8-46166e443369" containerName="controller-manager" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.316444 4849 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.321995 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.328386 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.328504 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.328388 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.328589 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.329304 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.333061 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bd5d45fc8-5smfc"] Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.337577 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-proxy-ca-bundles\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.337646 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-client-ca\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.337687 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-config\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.337709 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-serving-cert\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.337736 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc24n\" (UniqueName: \"kubernetes.io/projected/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-kube-api-access-sc24n\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.337844 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.445035 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-client-ca\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.445101 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-config\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.445127 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-serving-cert\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.445165 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc24n\" (UniqueName: \"kubernetes.io/projected/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-kube-api-access-sc24n\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.445196 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-proxy-ca-bundles\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.446354 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-client-ca\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.448284 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-config\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.449489 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-proxy-ca-bundles\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.460172 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-serving-cert\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.462998 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc24n\" (UniqueName: \"kubernetes.io/projected/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-kube-api-access-sc24n\") pod \"controller-manager-bd5d45fc8-5smfc\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc 
kubenswrapper[4849]: I0320 13:28:50.471560 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" event={"ID":"9c3119af-6692-427f-9114-85cb697af99e","Type":"ContainerStarted","Data":"60b58bf2e4a9145de28ff42f18b20ae2017da1b5efa1ffed5b1b6123d3ee367f"} Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.471842 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.478308 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2fkf" event={"ID":"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10","Type":"ContainerStarted","Data":"a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b"} Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.479116 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.480609 4849 generic.go:334] "Generic (PLEG): container finished" podID="63553d28-5dba-492e-b004-043ea30ee635" containerID="0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab" exitCode=0 Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.480658 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx4fv" event={"ID":"63553d28-5dba-492e-b004-043ea30ee635","Type":"ContainerDied","Data":"0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab"} Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.492023 4849 generic.go:334] "Generic (PLEG): container finished" podID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerID="561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193" exitCode=0 Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.492110 4849 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zspj" event={"ID":"5e607d4a-4c18-4de3-9b29-c5f32fadee50","Type":"ContainerDied","Data":"561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193"} Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.499582 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.505800 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" podStartSLOduration=6.505777149 podStartE2EDuration="6.505777149s" podCreationTimestamp="2026-03-20 13:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:50.499519836 +0000 UTC m=+280.177243241" watchObservedRunningTime="2026-03-20 13:28:50.505777149 +0000 UTC m=+280.183500544" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.513424 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zwxp" event={"ID":"ee7ffb06-f91c-4469-9c5d-ee0a4296c805","Type":"ContainerStarted","Data":"e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935"} Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.529403 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.529898 4849 generic.go:334] "Generic (PLEG): container finished" podID="f1f2af94-ce72-498b-a231-d171ab0e8760" containerID="4026a7972ffb2d250f7fe26dd8f15e0561ad74b2bbd6b02d7f983d7364910223" exitCode=0 Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.529998 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" event={"ID":"f1f2af94-ce72-498b-a231-d171ab0e8760","Type":"ContainerDied","Data":"4026a7972ffb2d250f7fe26dd8f15e0561ad74b2bbd6b02d7f983d7364910223"} Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.530035 4849 scope.go:117] "RemoveContainer" containerID="4026a7972ffb2d250f7fe26dd8f15e0561ad74b2bbd6b02d7f983d7364910223" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.552020 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgk97" event={"ID":"b7e8bcae-39ef-4786-b2b8-18dea74380fa","Type":"ContainerStarted","Data":"10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b"} Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.558415 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qf9r2" event={"ID":"5334df4b-9d2b-41c6-a18d-07c1c4edfd4e","Type":"ContainerDied","Data":"ef26b5e699f9eda4372e9cccacac3868b1a718aa32de1daac348a66b5a75d0c2"} Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.558599 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qf9r2" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.599149 4849 scope.go:117] "RemoveContainer" containerID="1072baad1d600c9f917f9b2cbd8bd6691f8e4df5abd7eb57637b0ab94b61b9f3" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.657298 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-error\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.657680 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-login\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.657896 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-dir\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.657936 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-policies\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658103 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-cliconfig\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658137 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-idp-0-file-data\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658350 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-router-certs\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658374 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-serving-cert\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658403 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-service-ca\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658610 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccnh5\" (UniqueName: 
\"kubernetes.io/projected/f1f2af94-ce72-498b-a231-d171ab0e8760-kube-api-access-ccnh5\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658655 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-session\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658851 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-trusted-ca-bundle\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.658885 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-ocp-branding-template\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.659054 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-provider-selection\") pod \"f1f2af94-ce72-498b-a231-d171ab0e8760\" (UID: \"f1f2af94-ce72-498b-a231-d171ab0e8760\") " Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.667568 4849 scope.go:117] "RemoveContainer" containerID="975f4e3ad717bd86822dc1225bf93aeea634d7107c6aaffaecdf2689d11fa64c" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 
13:28:50.668994 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.669842 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.670643 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.674290 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.683521 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.707665 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f2af94-ce72-498b-a231-d171ab0e8760-kube-api-access-ccnh5" (OuterVolumeSpecName: "kube-api-access-ccnh5") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "kube-api-access-ccnh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.725779 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.737959 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qf9r2"] Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.740116 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qf9r2"] Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.741426 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.749942 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.751618 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.755344 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761742 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761786 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761796 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761806 4849 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761832 4849 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc 
kubenswrapper[4849]: I0320 13:28:50.761844 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761880 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761895 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761917 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761929 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.761939 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccnh5\" (UniqueName: \"kubernetes.io/projected/f1f2af94-ce72-498b-a231-d171ab0e8760-kube-api-access-ccnh5\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.763038 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-session" 
(OuterVolumeSpecName: "v4-0-config-system-session") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.773542 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.774067 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f1f2af94-ce72-498b-a231-d171ab0e8760" (UID: "f1f2af94-ce72-498b-a231-d171ab0e8760"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.819234 4849 scope.go:117] "RemoveContainer" containerID="509874ccafb17caeb23da1e9f48b80faca8b480febc2c9037da606b6b7708a63" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.862603 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.862635 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:50 crc kubenswrapper[4849]: I0320 13:28:50.862649 4849 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f1f2af94-ce72-498b-a231-d171ab0e8760-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.059455 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5334df4b-9d2b-41c6-a18d-07c1c4edfd4e" path="/var/lib/kubelet/pods/5334df4b-9d2b-41c6-a18d-07c1c4edfd4e/volumes" Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.060347 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99301b48-4cf1-41aa-bbb8-46166e443369" path="/var/lib/kubelet/pods/99301b48-4cf1-41aa-bbb8-46166e443369/volumes" Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.177011 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bd5d45fc8-5smfc"] Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.566913 4849 generic.go:334] "Generic (PLEG): container finished" 
podID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerID="10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b" exitCode=0 Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.566980 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgk97" event={"ID":"b7e8bcae-39ef-4786-b2b8-18dea74380fa","Type":"ContainerDied","Data":"10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b"} Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.572265 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" event={"ID":"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3","Type":"ContainerStarted","Data":"c1f3fd2374c6ef96f61d5dd29cc0e858fb70efe8b0b7b08beae7449a52bed826"} Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.573880 4849 generic.go:334] "Generic (PLEG): container finished" podID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerID="e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935" exitCode=0 Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.573934 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zwxp" event={"ID":"ee7ffb06-f91c-4469-9c5d-ee0a4296c805","Type":"ContainerDied","Data":"e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935"} Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.576651 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.577072 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lfr7p" event={"ID":"f1f2af94-ce72-498b-a231-d171ab0e8760","Type":"ContainerDied","Data":"7f6a81a1ea195eb380f609a8d85d21420d7c6a0122f97481aeafeededf2645d1"} Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.600090 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfr7p"] Mar 20 13:28:51 crc kubenswrapper[4849]: I0320 13:28:51.602796 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lfr7p"] Mar 20 13:28:52 crc kubenswrapper[4849]: I0320 13:28:52.597527 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" event={"ID":"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3","Type":"ContainerStarted","Data":"77a4eb77b8e8de345437ff0145b57816de5e57cc353171a09e88ac5d1160aa40"} Mar 20 13:28:52 crc kubenswrapper[4849]: I0320 13:28:52.620397 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" podStartSLOduration=8.620378688 podStartE2EDuration="8.620378688s" podCreationTimestamp="2026-03-20 13:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:52.618058274 +0000 UTC m=+282.295781679" watchObservedRunningTime="2026-03-20 13:28:52.620378688 +0000 UTC m=+282.298102083" Mar 20 13:28:53 crc kubenswrapper[4849]: I0320 13:28:53.041954 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f2af94-ce72-498b-a231-d171ab0e8760" path="/var/lib/kubelet/pods/f1f2af94-ce72-498b-a231-d171ab0e8760/volumes" Mar 20 13:28:53 crc 
kubenswrapper[4849]: I0320 13:28:53.604757 4849 generic.go:334] "Generic (PLEG): container finished" podID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerID="a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b" exitCode=0 Mar 20 13:28:53 crc kubenswrapper[4849]: I0320 13:28:53.604803 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2fkf" event={"ID":"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10","Type":"ContainerDied","Data":"a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b"} Mar 20 13:28:53 crc kubenswrapper[4849]: I0320 13:28:53.606803 4849 generic.go:334] "Generic (PLEG): container finished" podID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerID="59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8" exitCode=0 Mar 20 13:28:53 crc kubenswrapper[4849]: I0320 13:28:53.606843 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnk65" event={"ID":"02c87e15-4f0c-422f-812b-5a4bcbf1b639","Type":"ContainerDied","Data":"59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8"} Mar 20 13:28:53 crc kubenswrapper[4849]: I0320 13:28:53.607363 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:53 crc kubenswrapper[4849]: I0320 13:28:53.611424 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:28:55 crc kubenswrapper[4849]: I0320 13:28:55.621764 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zspj" event={"ID":"5e607d4a-4c18-4de3-9b29-c5f32fadee50","Type":"ContainerStarted","Data":"989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986"} Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.320542 4849 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication/oauth-openshift-75894779c6-lnrrs"] Mar 20 13:28:56 crc kubenswrapper[4849]: E0320 13:28:56.320867 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f2af94-ce72-498b-a231-d171ab0e8760" containerName="oauth-openshift" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.320881 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f2af94-ce72-498b-a231-d171ab0e8760" containerName="oauth-openshift" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.321082 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f2af94-ce72-498b-a231-d171ab0e8760" containerName="oauth-openshift" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.321551 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.324421 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.324562 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.324687 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.324978 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.325157 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.325219 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 
13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.325167 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.325343 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.325387 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.325686 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.327047 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.331203 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.334697 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.339221 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75894779c6-lnrrs"] Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.340869 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.348141 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434351 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434414 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434442 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434461 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-session\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434477 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434493 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-error\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434694 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-login\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434781 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434810 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-445kr\" (UniqueName: \"kubernetes.io/projected/39c86da4-0d67-40e2-9e20-e886d603f5b7-kube-api-access-445kr\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: 
\"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434885 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-audit-policies\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434911 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434956 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.434985 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 
13:28:56.435037 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39c86da4-0d67-40e2-9e20-e886d603f5b7-audit-dir\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537274 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537325 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537349 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-session\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537370 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537386 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-error\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537431 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-login\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537457 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537478 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-445kr\" (UniqueName: \"kubernetes.io/projected/39c86da4-0d67-40e2-9e20-e886d603f5b7-kube-api-access-445kr\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " 
pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537503 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-audit-policies\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537519 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537541 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537558 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537578 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/39c86da4-0d67-40e2-9e20-e886d603f5b7-audit-dir\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.537601 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.538885 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/39c86da4-0d67-40e2-9e20-e886d603f5b7-audit-dir\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.539404 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-service-ca\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.539548 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.539706 
4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-audit-policies\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.540122 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.543621 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-error\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.544148 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.544409 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-session\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: 
\"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.544569 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.544574 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-template-login\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.544802 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.544944 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-router-certs\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.545380 4849 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/39c86da4-0d67-40e2-9e20-e886d603f5b7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.561134 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-445kr\" (UniqueName: \"kubernetes.io/projected/39c86da4-0d67-40e2-9e20-e886d603f5b7-kube-api-access-445kr\") pod \"oauth-openshift-75894779c6-lnrrs\" (UID: \"39c86da4-0d67-40e2-9e20-e886d603f5b7\") " pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.643125 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5zspj" podStartSLOduration=5.461533395 podStartE2EDuration="1m11.643104978s" podCreationTimestamp="2026-03-20 13:27:45 +0000 UTC" firstStartedPulling="2026-03-20 13:27:48.316734974 +0000 UTC m=+217.994458369" lastFinishedPulling="2026-03-20 13:28:54.498306557 +0000 UTC m=+284.176029952" observedRunningTime="2026-03-20 13:28:56.64208312 +0000 UTC m=+286.319806515" watchObservedRunningTime="2026-03-20 13:28:56.643104978 +0000 UTC m=+286.320828373" Mar 20 13:28:56 crc kubenswrapper[4849]: I0320 13:28:56.649525 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:28:59 crc kubenswrapper[4849]: I0320 13:28:59.285587 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75894779c6-lnrrs"] Mar 20 13:28:59 crc kubenswrapper[4849]: W0320 13:28:59.299751 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c86da4_0d67_40e2_9e20_e886d603f5b7.slice/crio-73404e97c1fb131de8213802434c438db32dbf20004f8e0faa0929d30895b70b WatchSource:0}: Error finding container 73404e97c1fb131de8213802434c438db32dbf20004f8e0faa0929d30895b70b: Status 404 returned error can't find the container with id 73404e97c1fb131de8213802434c438db32dbf20004f8e0faa0929d30895b70b Mar 20 13:28:59 crc kubenswrapper[4849]: I0320 13:28:59.643087 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" event={"ID":"39c86da4-0d67-40e2-9e20-e886d603f5b7","Type":"ContainerStarted","Data":"73404e97c1fb131de8213802434c438db32dbf20004f8e0faa0929d30895b70b"} Mar 20 13:28:59 crc kubenswrapper[4849]: I0320 13:28:59.645276 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx4fv" event={"ID":"63553d28-5dba-492e-b004-043ea30ee635","Type":"ContainerStarted","Data":"613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4"} Mar 20 13:28:59 crc kubenswrapper[4849]: I0320 13:28:59.647575 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zwxp" event={"ID":"ee7ffb06-f91c-4469-9c5d-ee0a4296c805","Type":"ContainerStarted","Data":"c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7"} Mar 20 13:28:59 crc kubenswrapper[4849]: I0320 13:28:59.649616 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgk97" 
event={"ID":"b7e8bcae-39ef-4786-b2b8-18dea74380fa","Type":"ContainerStarted","Data":"072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca"} Mar 20 13:29:00 crc kubenswrapper[4849]: I0320 13:29:00.662606 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" event={"ID":"39c86da4-0d67-40e2-9e20-e886d603f5b7","Type":"ContainerStarted","Data":"589faa29898e2a6ca7b153f2316a9a8d1493f19febbea1b84551188f17b89dfc"} Mar 20 13:29:00 crc kubenswrapper[4849]: I0320 13:29:00.682994 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dgk97" podStartSLOduration=4.33492762 podStartE2EDuration="1m13.68297427s" podCreationTimestamp="2026-03-20 13:27:47 +0000 UTC" firstStartedPulling="2026-03-20 13:27:49.416164166 +0000 UTC m=+219.093887561" lastFinishedPulling="2026-03-20 13:28:58.764210816 +0000 UTC m=+288.441934211" observedRunningTime="2026-03-20 13:29:00.679732081 +0000 UTC m=+290.357455496" watchObservedRunningTime="2026-03-20 13:29:00.68297427 +0000 UTC m=+290.360697665" Mar 20 13:29:00 crc kubenswrapper[4849]: I0320 13:29:00.703754 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xx4fv" podStartSLOduration=5.496979101 podStartE2EDuration="1m15.703734382s" podCreationTimestamp="2026-03-20 13:27:45 +0000 UTC" firstStartedPulling="2026-03-20 13:27:47.18995482 +0000 UTC m=+216.867678215" lastFinishedPulling="2026-03-20 13:28:57.396710091 +0000 UTC m=+287.074433496" observedRunningTime="2026-03-20 13:29:00.696702929 +0000 UTC m=+290.374426324" watchObservedRunningTime="2026-03-20 13:29:00.703734382 +0000 UTC m=+290.381457777" Mar 20 13:29:00 crc kubenswrapper[4849]: I0320 13:29:00.726192 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6zwxp" podStartSLOduration=4.543156797 
podStartE2EDuration="1m13.726171481s" podCreationTimestamp="2026-03-20 13:27:47 +0000 UTC" firstStartedPulling="2026-03-20 13:27:49.353945061 +0000 UTC m=+219.031668446" lastFinishedPulling="2026-03-20 13:28:58.536959715 +0000 UTC m=+288.214683130" observedRunningTime="2026-03-20 13:29:00.722952542 +0000 UTC m=+290.400675947" watchObservedRunningTime="2026-03-20 13:29:00.726171481 +0000 UTC m=+290.403894886" Mar 20 13:29:01 crc kubenswrapper[4849]: I0320 13:29:01.667270 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:29:01 crc kubenswrapper[4849]: I0320 13:29:01.673591 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" Mar 20 13:29:01 crc kubenswrapper[4849]: I0320 13:29:01.690925 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75894779c6-lnrrs" podStartSLOduration=37.69090391 podStartE2EDuration="37.69090391s" podCreationTimestamp="2026-03-20 13:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:29:01.684500224 +0000 UTC m=+291.362223619" watchObservedRunningTime="2026-03-20 13:29:01.69090391 +0000 UTC m=+291.368627305" Mar 20 13:29:02 crc kubenswrapper[4849]: I0320 13:29:02.683771 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnk65" event={"ID":"02c87e15-4f0c-422f-812b-5a4bcbf1b639","Type":"ContainerStarted","Data":"de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f"} Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.444379 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lnk65" podStartSLOduration=6.073784057 podStartE2EDuration="1m16.44435796s" 
podCreationTimestamp="2026-03-20 13:27:48 +0000 UTC" firstStartedPulling="2026-03-20 13:27:51.523143394 +0000 UTC m=+221.200866789" lastFinishedPulling="2026-03-20 13:29:01.893717297 +0000 UTC m=+291.571440692" observedRunningTime="2026-03-20 13:29:03.712517457 +0000 UTC m=+293.390240872" watchObservedRunningTime="2026-03-20 13:29:04.44435796 +0000 UTC m=+294.122081355" Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.445404 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bd5d45fc8-5smfc"] Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.445592 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" podUID="a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" containerName="controller-manager" containerID="cri-o://77a4eb77b8e8de345437ff0145b57816de5e57cc353171a09e88ac5d1160aa40" gracePeriod=30 Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.544192 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv"] Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.544470 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" podUID="9c3119af-6692-427f-9114-85cb697af99e" containerName="route-controller-manager" containerID="cri-o://60b58bf2e4a9145de28ff42f18b20ae2017da1b5efa1ffed5b1b6123d3ee367f" gracePeriod=30 Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.708063 4849 generic.go:334] "Generic (PLEG): container finished" podID="a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" containerID="77a4eb77b8e8de345437ff0145b57816de5e57cc353171a09e88ac5d1160aa40" exitCode=0 Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.708151 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" event={"ID":"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3","Type":"ContainerDied","Data":"77a4eb77b8e8de345437ff0145b57816de5e57cc353171a09e88ac5d1160aa40"} Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.711124 4849 generic.go:334] "Generic (PLEG): container finished" podID="9c3119af-6692-427f-9114-85cb697af99e" containerID="60b58bf2e4a9145de28ff42f18b20ae2017da1b5efa1ffed5b1b6123d3ee367f" exitCode=0 Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.711216 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" event={"ID":"9c3119af-6692-427f-9114-85cb697af99e","Type":"ContainerDied","Data":"60b58bf2e4a9145de28ff42f18b20ae2017da1b5efa1ffed5b1b6123d3ee367f"} Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.716211 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2fkf" event={"ID":"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10","Type":"ContainerStarted","Data":"5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609"} Mar 20 13:29:04 crc kubenswrapper[4849]: I0320 13:29:04.739072 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v2fkf" podStartSLOduration=4.835551473 podStartE2EDuration="1m16.739037239s" podCreationTimestamp="2026-03-20 13:27:48 +0000 UTC" firstStartedPulling="2026-03-20 13:27:51.522166038 +0000 UTC m=+221.199889433" lastFinishedPulling="2026-03-20 13:29:03.425651794 +0000 UTC m=+293.103375199" observedRunningTime="2026-03-20 13:29:04.732481638 +0000 UTC m=+294.410205053" watchObservedRunningTime="2026-03-20 13:29:04.739037239 +0000 UTC m=+294.416760634" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.067175 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.071373 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161443 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc24n\" (UniqueName: \"kubernetes.io/projected/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-kube-api-access-sc24n\") pod \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161513 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-serving-cert\") pod \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161587 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-config\") pod \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161628 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-client-ca\") pod \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161666 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-proxy-ca-bundles\") pod 
\"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\" (UID: \"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161692 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-config\") pod \"9c3119af-6692-427f-9114-85cb697af99e\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161720 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xwrj\" (UniqueName: \"kubernetes.io/projected/9c3119af-6692-427f-9114-85cb697af99e-kube-api-access-7xwrj\") pod \"9c3119af-6692-427f-9114-85cb697af99e\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161755 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-client-ca\") pod \"9c3119af-6692-427f-9114-85cb697af99e\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.161797 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3119af-6692-427f-9114-85cb697af99e-serving-cert\") pod \"9c3119af-6692-427f-9114-85cb697af99e\" (UID: \"9c3119af-6692-427f-9114-85cb697af99e\") " Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.162672 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c3119af-6692-427f-9114-85cb697af99e" (UID: "9c3119af-6692-427f-9114-85cb697af99e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.162725 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-config" (OuterVolumeSpecName: "config") pod "9c3119af-6692-427f-9114-85cb697af99e" (UID: "9c3119af-6692-427f-9114-85cb697af99e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.163564 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" (UID: "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.163553 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" (UID: "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.163814 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-config" (OuterVolumeSpecName: "config") pod "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" (UID: "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.168531 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3119af-6692-427f-9114-85cb697af99e-kube-api-access-7xwrj" (OuterVolumeSpecName: "kube-api-access-7xwrj") pod "9c3119af-6692-427f-9114-85cb697af99e" (UID: "9c3119af-6692-427f-9114-85cb697af99e"). InnerVolumeSpecName "kube-api-access-7xwrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.168541 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" (UID: "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.168646 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c3119af-6692-427f-9114-85cb697af99e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c3119af-6692-427f-9114-85cb697af99e" (UID: "9c3119af-6692-427f-9114-85cb697af99e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.169152 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-kube-api-access-sc24n" (OuterVolumeSpecName: "kube-api-access-sc24n") pod "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" (UID: "a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3"). InnerVolumeSpecName "kube-api-access-sc24n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264100 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264330 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264347 4849 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264361 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264371 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xwrj\" (UniqueName: \"kubernetes.io/projected/9c3119af-6692-427f-9114-85cb697af99e-kube-api-access-7xwrj\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264388 4849 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c3119af-6692-427f-9114-85cb697af99e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264396 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c3119af-6692-427f-9114-85cb697af99e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264404 4849 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-sc24n\" (UniqueName: \"kubernetes.io/projected/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-kube-api-access-sc24n\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.264412 4849 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.592109 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.592259 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.645342 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.723442 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" event={"ID":"a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3","Type":"ContainerDied","Data":"c1f3fd2374c6ef96f61d5dd29cc0e858fb70efe8b0b7b08beae7449a52bed826"} Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.723472 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bd5d45fc8-5smfc" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.723550 4849 scope.go:117] "RemoveContainer" containerID="77a4eb77b8e8de345437ff0145b57816de5e57cc353171a09e88ac5d1160aa40" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.724997 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.725067 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv" event={"ID":"9c3119af-6692-427f-9114-85cb697af99e","Type":"ContainerDied","Data":"0bb34c0e905019f147dc6f2313ecda868a0091000784c40aad84c663021632d3"} Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.743019 4849 scope.go:117] "RemoveContainer" containerID="60b58bf2e4a9145de28ff42f18b20ae2017da1b5efa1ffed5b1b6123d3ee367f" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.763770 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv"] Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.772172 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b875597f4-428nv"] Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.789997 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bd5d45fc8-5smfc"] Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.795755 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bd5d45fc8-5smfc"] Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.798081 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.973772 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:29:05 crc kubenswrapper[4849]: I0320 13:29:05.973834 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5zspj" Mar 20 
13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.019919 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.327388 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b88d74656-b6mvc"] Mar 20 13:29:06 crc kubenswrapper[4849]: E0320 13:29:06.327859 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" containerName="controller-manager" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.327883 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" containerName="controller-manager" Mar 20 13:29:06 crc kubenswrapper[4849]: E0320 13:29:06.327913 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3119af-6692-427f-9114-85cb697af99e" containerName="route-controller-manager" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.327924 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3119af-6692-427f-9114-85cb697af99e" containerName="route-controller-manager" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.328106 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" containerName="controller-manager" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.328130 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3119af-6692-427f-9114-85cb697af99e" containerName="route-controller-manager" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.328732 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.330529 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj"] Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.331413 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: W0320 13:29:06.335016 4849 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 20 13:29:06 crc kubenswrapper[4849]: E0320 13:29:06.335101 4849 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:29:06 crc kubenswrapper[4849]: W0320 13:29:06.335159 4849 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 20 13:29:06 crc kubenswrapper[4849]: E0320 13:29:06.335176 4849 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:29:06 crc kubenswrapper[4849]: W0320 13:29:06.336116 4849 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Mar 20 13:29:06 crc kubenswrapper[4849]: E0320 13:29:06.336143 4849 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.336285 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.336607 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.336677 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 
13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.336729 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:29:06 crc kubenswrapper[4849]: W0320 13:29:06.336808 4849 reflector.go:561] object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Mar 20 13:29:06 crc kubenswrapper[4849]: E0320 13:29:06.336846 4849 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.337916 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.338557 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:29:06 crc kubenswrapper[4849]: W0320 13:29:06.344615 4849 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 20 13:29:06 crc kubenswrapper[4849]: E0320 13:29:06.344681 4849 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.344787 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.345117 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.356113 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b88d74656-b6mvc"] Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.365431 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj"] Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379465 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpn89\" (UniqueName: \"kubernetes.io/projected/0abdc0e6-5da7-4235-a390-83072a03a068-kube-api-access-vpn89\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379544 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnhvs\" (UniqueName: \"kubernetes.io/projected/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-kube-api-access-fnhvs\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " 
pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379591 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0abdc0e6-5da7-4235-a390-83072a03a068-client-ca\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379638 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-client-ca\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379674 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-config\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379712 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-serving-cert\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379738 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-proxy-ca-bundles\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379778 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abdc0e6-5da7-4235-a390-83072a03a068-config\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.379810 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0abdc0e6-5da7-4235-a390-83072a03a068-serving-cert\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481572 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpn89\" (UniqueName: \"kubernetes.io/projected/0abdc0e6-5da7-4235-a390-83072a03a068-kube-api-access-vpn89\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481638 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnhvs\" (UniqueName: \"kubernetes.io/projected/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-kube-api-access-fnhvs\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " 
pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481667 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0abdc0e6-5da7-4235-a390-83072a03a068-client-ca\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481696 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-client-ca\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481719 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-config\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481740 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-serving-cert\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481760 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-proxy-ca-bundles\") pod 
\"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481792 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abdc0e6-5da7-4235-a390-83072a03a068-config\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.481809 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0abdc0e6-5da7-4235-a390-83072a03a068-serving-cert\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.482952 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0abdc0e6-5da7-4235-a390-83072a03a068-client-ca\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.482960 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-client-ca\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.483443 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0abdc0e6-5da7-4235-a390-83072a03a068-config\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.501899 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-serving-cert\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.504596 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnhvs\" (UniqueName: \"kubernetes.io/projected/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-kube-api-access-fnhvs\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.505836 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0abdc0e6-5da7-4235-a390-83072a03a068-serving-cert\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:06 crc kubenswrapper[4849]: I0320 13:29:06.781893 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.042450 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3119af-6692-427f-9114-85cb697af99e" path="/var/lib/kubelet/pods/9c3119af-6692-427f-9114-85cb697af99e/volumes" Mar 20 13:29:07 crc 
kubenswrapper[4849]: I0320 13:29:07.043242 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3" path="/var/lib/kubelet/pods/a9fd9870-af59-4cfc-aab0-7a1a7ed2a8a3/volumes" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.064000 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zspj"] Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.377863 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.384379 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-config\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:07 crc kubenswrapper[4849]: E0320 13:29:07.483570 4849 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:29:07 crc kubenswrapper[4849]: E0320 13:29:07.483763 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-proxy-ca-bundles podName:866e8c08-10e3-495e-8cb5-3ba2e7c6d006 nodeName:}" failed. No retries permitted until 2026-03-20 13:29:07.983724698 +0000 UTC m=+297.661448103 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-proxy-ca-bundles") pod "controller-manager-6b88d74656-b6mvc" (UID: "866e8c08-10e3-495e-8cb5-3ba2e7c6d006") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:29:07 crc kubenswrapper[4849]: E0320 13:29:07.504911 4849 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:29:07 crc kubenswrapper[4849]: E0320 13:29:07.504993 4849 projected.go:194] Error preparing data for projected volume kube-api-access-vpn89 for pod openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:29:07 crc kubenswrapper[4849]: E0320 13:29:07.505087 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0abdc0e6-5da7-4235-a390-83072a03a068-kube-api-access-vpn89 podName:0abdc0e6-5da7-4235-a390-83072a03a068 nodeName:}" failed. No retries permitted until 2026-03-20 13:29:08.005062156 +0000 UTC m=+297.682785551 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vpn89" (UniqueName: "kubernetes.io/projected/0abdc0e6-5da7-4235-a390-83072a03a068-kube-api-access-vpn89") pod "route-controller-manager-5b855f9785-chvsj" (UID: "0abdc0e6-5da7-4235-a390-83072a03a068") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.523423 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.638557 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.696698 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.699761 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.917690 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.917756 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:29:07 crc kubenswrapper[4849]: I0320 13:29:07.964934 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.009894 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpn89\" (UniqueName: \"kubernetes.io/projected/0abdc0e6-5da7-4235-a390-83072a03a068-kube-api-access-vpn89\") pod 
\"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.010007 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-proxy-ca-bundles\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.011141 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/866e8c08-10e3-495e-8cb5-3ba2e7c6d006-proxy-ca-bundles\") pod \"controller-manager-6b88d74656-b6mvc\" (UID: \"866e8c08-10e3-495e-8cb5-3ba2e7c6d006\") " pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.016212 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpn89\" (UniqueName: \"kubernetes.io/projected/0abdc0e6-5da7-4235-a390-83072a03a068-kube-api-access-vpn89\") pod \"route-controller-manager-5b855f9785-chvsj\" (UID: \"0abdc0e6-5da7-4235-a390-83072a03a068\") " pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.156953 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.160860 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.160918 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.167664 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.208998 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.491898 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj"] Mar 20 13:29:08 crc kubenswrapper[4849]: W0320 13:29:08.500537 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0abdc0e6_5da7_4235_a390_83072a03a068.slice/crio-97ed13ce5e09e9b27ba0164c02a2491a60e6416673e16db918a3de1e89c2244d WatchSource:0}: Error finding container 97ed13ce5e09e9b27ba0164c02a2491a60e6416673e16db918a3de1e89c2244d: Status 404 returned error can't find the container with id 97ed13ce5e09e9b27ba0164c02a2491a60e6416673e16db918a3de1e89c2244d Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.619538 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b88d74656-b6mvc"] Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.747277 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" event={"ID":"0abdc0e6-5da7-4235-a390-83072a03a068","Type":"ContainerStarted","Data":"0d324f6b62925c996663df9a2fc341069ee29abff32363cdc4c578a2d50756dd"} Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.747325 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" event={"ID":"0abdc0e6-5da7-4235-a390-83072a03a068","Type":"ContainerStarted","Data":"97ed13ce5e09e9b27ba0164c02a2491a60e6416673e16db918a3de1e89c2244d"} Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.747676 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.749142 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" event={"ID":"866e8c08-10e3-495e-8cb5-3ba2e7c6d006","Type":"ContainerStarted","Data":"93ca491f9d53123945d0be6f3cb961d599516e8339b99dadce26f074e77bc8cf"} Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.749305 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5zspj" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerName="registry-server" containerID="cri-o://989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986" gracePeriod=2 Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.751803 4849 patch_prober.go:28] interesting pod/route-controller-manager-5b855f9785-chvsj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" start-of-body= Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.751860 4849 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" podUID="0abdc0e6-5da7-4235-a390-83072a03a068" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.70:8443/healthz\": dial tcp 10.217.0.70:8443: connect: connection refused" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.767411 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" podStartSLOduration=4.767385254 podStartE2EDuration="4.767385254s" podCreationTimestamp="2026-03-20 13:29:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:29:08.764487184 +0000 UTC m=+298.442210599" watchObservedRunningTime="2026-03-20 13:29:08.767385254 +0000 UTC m=+298.445108639" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.798413 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.816619 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.863357 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:29:08 crc kubenswrapper[4849]: I0320 13:29:08.863413 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.200294 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.268618 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.268688 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.327160 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf4g5\" (UniqueName: \"kubernetes.io/projected/5e607d4a-4c18-4de3-9b29-c5f32fadee50-kube-api-access-xf4g5\") pod \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.327520 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-catalog-content\") pod \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.327670 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-utilities\") pod \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\" (UID: \"5e607d4a-4c18-4de3-9b29-c5f32fadee50\") " Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.328426 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-utilities" (OuterVolumeSpecName: "utilities") pod "5e607d4a-4c18-4de3-9b29-c5f32fadee50" (UID: "5e607d4a-4c18-4de3-9b29-c5f32fadee50"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.348036 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e607d4a-4c18-4de3-9b29-c5f32fadee50-kube-api-access-xf4g5" (OuterVolumeSpecName: "kube-api-access-xf4g5") pod "5e607d4a-4c18-4de3-9b29-c5f32fadee50" (UID: "5e607d4a-4c18-4de3-9b29-c5f32fadee50"). InnerVolumeSpecName "kube-api-access-xf4g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.394170 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e607d4a-4c18-4de3-9b29-c5f32fadee50" (UID: "5e607d4a-4c18-4de3-9b29-c5f32fadee50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.429417 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf4g5\" (UniqueName: \"kubernetes.io/projected/5e607d4a-4c18-4de3-9b29-c5f32fadee50-kube-api-access-xf4g5\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.429453 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.429461 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e607d4a-4c18-4de3-9b29-c5f32fadee50-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.766339 4849 generic.go:334] "Generic (PLEG): container finished" podID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" 
containerID="989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986" exitCode=0 Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.766390 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5zspj" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.766428 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zspj" event={"ID":"5e607d4a-4c18-4de3-9b29-c5f32fadee50","Type":"ContainerDied","Data":"989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986"} Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.766504 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5zspj" event={"ID":"5e607d4a-4c18-4de3-9b29-c5f32fadee50","Type":"ContainerDied","Data":"4eea9cb47fa7c872efd7b16ed772ab07dde18f98704865ab4f053c8a7a66a3ed"} Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.766532 4849 scope.go:117] "RemoveContainer" containerID="989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.768437 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" event={"ID":"866e8c08-10e3-495e-8cb5-3ba2e7c6d006","Type":"ContainerStarted","Data":"b356588ebc772ffd84f61455a067c843c3b52d4c49323b9db0737e36033d8546"} Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.768961 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.772897 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b855f9785-chvsj" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.776549 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.785446 4849 scope.go:117] "RemoveContainer" containerID="561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.798765 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b88d74656-b6mvc" podStartSLOduration=5.798741188 podStartE2EDuration="5.798741188s" podCreationTimestamp="2026-03-20 13:29:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:29:09.792322682 +0000 UTC m=+299.470046077" watchObservedRunningTime="2026-03-20 13:29:09.798741188 +0000 UTC m=+299.476464583" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.823506 4849 scope.go:117] "RemoveContainer" containerID="1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.848812 4849 scope.go:117] "RemoveContainer" containerID="989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.850296 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5zspj"] Mar 20 13:29:09 crc kubenswrapper[4849]: E0320 13:29:09.851147 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986\": container with ID starting with 989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986 not found: ID does not exist" containerID="989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.851192 4849 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986"} err="failed to get container status \"989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986\": rpc error: code = NotFound desc = could not find container \"989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986\": container with ID starting with 989dd9387ccee2c71f5e98851eaca4b06c1384e7d32351935029f7a038cea986 not found: ID does not exist" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.851221 4849 scope.go:117] "RemoveContainer" containerID="561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193" Mar 20 13:29:09 crc kubenswrapper[4849]: E0320 13:29:09.851827 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193\": container with ID starting with 561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193 not found: ID does not exist" containerID="561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.851890 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193"} err="failed to get container status \"561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193\": rpc error: code = NotFound desc = could not find container \"561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193\": container with ID starting with 561fc02ede680528f9a1a9529b67aa18a073f003664871f0a594d5464b818193 not found: ID does not exist" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.851929 4849 scope.go:117] "RemoveContainer" containerID="1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a" Mar 20 13:29:09 crc kubenswrapper[4849]: E0320 13:29:09.852424 4849 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a\": container with ID starting with 1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a not found: ID does not exist" containerID="1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.852465 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a"} err="failed to get container status \"1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a\": rpc error: code = NotFound desc = could not find container \"1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a\": container with ID starting with 1146ecca42a20ab0377bd94a8a0f451257a2a25a81ec7e1990cd4d30e14b878a not found: ID does not exist" Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.856139 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5zspj"] Mar 20 13:29:09 crc kubenswrapper[4849]: I0320 13:29:09.908831 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lnk65" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="registry-server" probeResult="failure" output=< Mar 20 13:29:09 crc kubenswrapper[4849]: timeout: failed to connect service ":50051" within 1s Mar 20 13:29:09 crc kubenswrapper[4849]: > Mar 20 13:29:10 crc kubenswrapper[4849]: I0320 13:29:10.265256 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zwxp"] Mar 20 13:29:10 crc kubenswrapper[4849]: I0320 13:29:10.326134 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v2fkf" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="registry-server" probeResult="failure" output=< Mar 20 13:29:10 crc 
kubenswrapper[4849]: timeout: failed to connect service ":50051" within 1s Mar 20 13:29:10 crc kubenswrapper[4849]: > Mar 20 13:29:10 crc kubenswrapper[4849]: I0320 13:29:10.775382 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6zwxp" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerName="registry-server" containerID="cri-o://c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7" gracePeriod=2 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.046065 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" path="/var/lib/kubelet/pods/5e607d4a-4c18-4de3-9b29-c5f32fadee50/volumes" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.059770 4849 info.go:109] Failed to get network devices: open /sys/class/net/c7e51cbed536940/address: no such file or directory Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.203895 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.267143 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-catalog-content\") pod \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.267196 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-utilities\") pod \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.267312 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbb4d\" (UniqueName: \"kubernetes.io/projected/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-kube-api-access-nbb4d\") pod \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\" (UID: \"ee7ffb06-f91c-4469-9c5d-ee0a4296c805\") " Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.268293 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-utilities" (OuterVolumeSpecName: "utilities") pod "ee7ffb06-f91c-4469-9c5d-ee0a4296c805" (UID: "ee7ffb06-f91c-4469-9c5d-ee0a4296c805"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.273241 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-kube-api-access-nbb4d" (OuterVolumeSpecName: "kube-api-access-nbb4d") pod "ee7ffb06-f91c-4469-9c5d-ee0a4296c805" (UID: "ee7ffb06-f91c-4469-9c5d-ee0a4296c805"). InnerVolumeSpecName "kube-api-access-nbb4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.300370 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee7ffb06-f91c-4469-9c5d-ee0a4296c805" (UID: "ee7ffb06-f91c-4469-9c5d-ee0a4296c805"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.369147 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.369193 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.369207 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbb4d\" (UniqueName: \"kubernetes.io/projected/ee7ffb06-f91c-4469-9c5d-ee0a4296c805-kube-api-access-nbb4d\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.615887 4849 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.616240 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerName="registry-server" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616253 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerName="registry-server" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.616265 4849 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerName="extract-content" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616274 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerName="extract-content" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.616302 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerName="extract-utilities" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616311 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerName="extract-utilities" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.616328 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerName="extract-utilities" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616336 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerName="extract-utilities" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.616347 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerName="registry-server" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616354 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerName="registry-server" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.616384 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerName="extract-content" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616393 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerName="extract-content" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616517 4849 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5e607d4a-4c18-4de3-9b29-c5f32fadee50" containerName="registry-server" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616543 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerName="registry-server" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.616977 4849 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.617109 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.617210 4849 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.617812 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6" gracePeriod=15 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.617918 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626" gracePeriod=15 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.617942 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" containerID="cri-o://2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d" gracePeriod=15 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.618019 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5" gracePeriod=15 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.617943 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28" gracePeriod=15 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619351 4849 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.619606 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619621 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.619659 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619668 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 
13:29:11.619677 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619685 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.619698 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619705 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.619783 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619794 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.619803 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619847 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.619861 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619869 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.619882 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.619889 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620059 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620071 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620079 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620088 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620098 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620110 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620120 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.620250 4849 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620262 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.620274 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620281 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620398 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.620632 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.673467 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.673908 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc 
kubenswrapper[4849]: I0320 13:29:11.673931 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.674033 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.674216 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.674267 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.674327 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.674356 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775284 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775344 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775366 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775410 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775415 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775471 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775432 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775495 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775515 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 
13:29:11.775526 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775557 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775558 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775577 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775598 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 
13:29:11.775640 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.775735 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.784338 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.786914 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.787762 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d" exitCode=0 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.787801 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626" exitCode=0 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.787832 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5" exitCode=0 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.787849 4849 generic.go:334] 
"Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28" exitCode=2 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.787935 4849 scope.go:117] "RemoveContainer" containerID="f4fa63af7903b54cf0b79d06f183a96c128a1c39b2759233378bb6fce5a6d4a9" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.791348 4849 generic.go:334] "Generic (PLEG): container finished" podID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" containerID="c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7" exitCode=0 Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.791423 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zwxp" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.791455 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zwxp" event={"ID":"ee7ffb06-f91c-4469-9c5d-ee0a4296c805","Type":"ContainerDied","Data":"c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7"} Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.791516 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zwxp" event={"ID":"ee7ffb06-f91c-4469-9c5d-ee0a4296c805","Type":"ContainerDied","Data":"c7e51cbed536940c4857fd57ff7693bd7ec380c83bebce4e4bf8c514f2cfaa5d"} Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.792722 4849 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.793724 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" 
pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.806489 4849 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.807201 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.818101 4849 scope.go:117] "RemoveContainer" containerID="c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.903688 4849 scope.go:117] "RemoveContainer" containerID="e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.920861 4849 scope.go:117] "RemoveContainer" containerID="ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.937188 4849 scope.go:117] "RemoveContainer" containerID="c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.937922 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7\": container with 
ID starting with c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7 not found: ID does not exist" containerID="c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.938017 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7"} err="failed to get container status \"c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7\": rpc error: code = NotFound desc = could not find container \"c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7\": container with ID starting with c4482be04844b4e8535b80fca839cef92d86adc56c29cda7c241029c452149a7 not found: ID does not exist" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.938057 4849 scope.go:117] "RemoveContainer" containerID="e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.938541 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935\": container with ID starting with e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935 not found: ID does not exist" containerID="e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.938570 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935"} err="failed to get container status \"e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935\": rpc error: code = NotFound desc = could not find container \"e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935\": container with ID starting with e7e247823c9fc75e7a5fd8670e8d6e5dd6965895b26df09988a3db2ee34db935 not 
found: ID does not exist" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.938587 4849 scope.go:117] "RemoveContainer" containerID="ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe" Mar 20 13:29:11 crc kubenswrapper[4849]: E0320 13:29:11.939046 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe\": container with ID starting with ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe not found: ID does not exist" containerID="ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe" Mar 20 13:29:11 crc kubenswrapper[4849]: I0320 13:29:11.939110 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe"} err="failed to get container status \"ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe\": rpc error: code = NotFound desc = could not find container \"ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe\": container with ID starting with ae26d1ee9ebee5c6dff72ec66adf0a76c866c730ab234832944fc7ebdf1d5bbe not found: ID does not exist" Mar 20 13:29:12 crc kubenswrapper[4849]: I0320 13:29:12.351211 4849 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 20 13:29:12 crc kubenswrapper[4849]: I0320 13:29:12.351298 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 20 13:29:12 crc 
kubenswrapper[4849]: I0320 13:29:12.820842 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:29:12 crc kubenswrapper[4849]: I0320 13:29:12.824067 4849 generic.go:334] "Generic (PLEG): container finished" podID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" containerID="c8532dabc515123c220d97782867f7f0fac3616731f86a008e6a9add465f27f3" exitCode=0 Mar 20 13:29:12 crc kubenswrapper[4849]: I0320 13:29:12.824113 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e5ebd6cd-36b8-4c55-a33f-442885c800c3","Type":"ContainerDied","Data":"c8532dabc515123c220d97782867f7f0fac3616731f86a008e6a9add465f27f3"} Mar 20 13:29:12 crc kubenswrapper[4849]: I0320 13:29:12.825229 4849 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:12 crc kubenswrapper[4849]: I0320 13:29:12.825671 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:12 crc kubenswrapper[4849]: I0320 13:29:12.826229 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:13 crc kubenswrapper[4849]: 
I0320 13:29:13.992409 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:29:13 crc kubenswrapper[4849]: I0320 13:29:13.993656 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:13 crc kubenswrapper[4849]: I0320 13:29:13.994714 4849 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:13 crc kubenswrapper[4849]: I0320 13:29:13.994995 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:13 crc kubenswrapper[4849]: I0320 13:29:13.995187 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.112997 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.113362 4849 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.113702 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.113795 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.113905 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.114004 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.114610 4849 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.114639 4849 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.114652 4849 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.203565 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.204251 4849 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.204740 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.205731 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.316947 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kube-api-access\") pod \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.317045 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kubelet-dir\") pod \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.317107 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-var-lock\") pod \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\" (UID: \"e5ebd6cd-36b8-4c55-a33f-442885c800c3\") " Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.317121 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e5ebd6cd-36b8-4c55-a33f-442885c800c3" (UID: "e5ebd6cd-36b8-4c55-a33f-442885c800c3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.317177 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-var-lock" (OuterVolumeSpecName: "var-lock") pod "e5ebd6cd-36b8-4c55-a33f-442885c800c3" (UID: "e5ebd6cd-36b8-4c55-a33f-442885c800c3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.317383 4849 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.317400 4849 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.323513 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e5ebd6cd-36b8-4c55-a33f-442885c800c3" (UID: "e5ebd6cd-36b8-4c55-a33f-442885c800c3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.418945 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5ebd6cd-36b8-4c55-a33f-442885c800c3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.848891 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.849772 4849 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6" exitCode=0 Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.849885 4849 scope.go:117] "RemoveContainer" containerID="2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.849890 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.852159 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e5ebd6cd-36b8-4c55-a33f-442885c800c3","Type":"ContainerDied","Data":"0a4bf5a304d9ac624cf7eb5afefcd5235f5ebc5ca8f684e87b72a73c39c6e0e8"} Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.852214 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4bf5a304d9ac624cf7eb5afefcd5235f5ebc5ca8f684e87b72a73c39c6e0e8" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.852173 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.867697 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.868176 4849 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.868456 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.871640 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.872097 4849 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection 
refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.872144 4849 scope.go:117] "RemoveContainer" containerID="754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.872402 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.891195 4849 scope.go:117] "RemoveContainer" containerID="f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.908064 4849 scope.go:117] "RemoveContainer" containerID="8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.930687 4849 scope.go:117] "RemoveContainer" containerID="a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.951103 4849 scope.go:117] "RemoveContainer" containerID="5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.980354 4849 scope.go:117] "RemoveContainer" containerID="2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d" Mar 20 13:29:14 crc kubenswrapper[4849]: E0320 13:29:14.980899 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\": container with ID starting with 2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d not found: ID does not exist" containerID="2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 
13:29:14.980993 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d"} err="failed to get container status \"2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\": rpc error: code = NotFound desc = could not find container \"2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d\": container with ID starting with 2722e5da1378374dc232ff556392420943816eeb132ad36155e14f493dcf8d4d not found: ID does not exist" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.981022 4849 scope.go:117] "RemoveContainer" containerID="754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626" Mar 20 13:29:14 crc kubenswrapper[4849]: E0320 13:29:14.981512 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\": container with ID starting with 754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626 not found: ID does not exist" containerID="754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.981565 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626"} err="failed to get container status \"754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\": rpc error: code = NotFound desc = could not find container \"754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626\": container with ID starting with 754b0a7e461101feb98ca9e1ab020e0ae1e41ee6c80d0eb212210bed1ed48626 not found: ID does not exist" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.981594 4849 scope.go:117] "RemoveContainer" containerID="f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5" Mar 20 13:29:14 crc 
kubenswrapper[4849]: E0320 13:29:14.982354 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\": container with ID starting with f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5 not found: ID does not exist" containerID="f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.982423 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5"} err="failed to get container status \"f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\": rpc error: code = NotFound desc = could not find container \"f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5\": container with ID starting with f8277347bb17110efd0a7806089766468e964bea98a1c36d51ddf30b713985c5 not found: ID does not exist" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.982461 4849 scope.go:117] "RemoveContainer" containerID="8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28" Mar 20 13:29:14 crc kubenswrapper[4849]: E0320 13:29:14.982998 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\": container with ID starting with 8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28 not found: ID does not exist" containerID="8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.983027 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28"} err="failed to get container status 
\"8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\": rpc error: code = NotFound desc = could not find container \"8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28\": container with ID starting with 8f446c917a4989429be25c6b262188364556e57c0c6ee31d1b11d4db76741d28 not found: ID does not exist" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.983043 4849 scope.go:117] "RemoveContainer" containerID="a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6" Mar 20 13:29:14 crc kubenswrapper[4849]: E0320 13:29:14.983423 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\": container with ID starting with a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6 not found: ID does not exist" containerID="a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.983452 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6"} err="failed to get container status \"a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\": rpc error: code = NotFound desc = could not find container \"a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6\": container with ID starting with a68f83a225d722d073ceb756ee404aba43369b9f75602d33b097a781a90559b6 not found: ID does not exist" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.983469 4849 scope.go:117] "RemoveContainer" containerID="5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1" Mar 20 13:29:14 crc kubenswrapper[4849]: E0320 13:29:14.984006 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\": container with ID starting with 5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1 not found: ID does not exist" containerID="5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1" Mar 20 13:29:14 crc kubenswrapper[4849]: I0320 13:29:14.984029 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1"} err="failed to get container status \"5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\": rpc error: code = NotFound desc = could not find container \"5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1\": container with ID starting with 5454a982758eeb19e3a56ecbf6c9acbeb5d6b06367f65dfc680a906a423af7a1 not found: ID does not exist" Mar 20 13:29:15 crc kubenswrapper[4849]: I0320 13:29:15.044151 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 13:29:15 crc kubenswrapper[4849]: E0320 13:29:15.649328 4849 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:15 crc kubenswrapper[4849]: E0320 13:29:15.650329 4849 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:15 crc kubenswrapper[4849]: E0320 13:29:15.650849 4849 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 
20 13:29:15 crc kubenswrapper[4849]: E0320 13:29:15.651162 4849 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:15 crc kubenswrapper[4849]: E0320 13:29:15.651490 4849 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:15 crc kubenswrapper[4849]: I0320 13:29:15.651525 4849 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 13:29:15 crc kubenswrapper[4849]: E0320 13:29:15.651852 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms" Mar 20 13:29:15 crc kubenswrapper[4849]: E0320 13:29:15.853962 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Mar 20 13:29:16 crc kubenswrapper[4849]: E0320 13:29:16.255514 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Mar 20 13:29:16 crc kubenswrapper[4849]: E0320 13:29:16.663741 4849 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial 
tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:16 crc kubenswrapper[4849]: I0320 13:29:16.664320 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:16 crc kubenswrapper[4849]: W0320 13:29:16.687531 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-e0ac3f3d63d1243ff84a8142039f1d5890846e0601e9f8a883bc9677748af8cb WatchSource:0}: Error finding container e0ac3f3d63d1243ff84a8142039f1d5890846e0601e9f8a883bc9677748af8cb: Status 404 returned error can't find the container with id e0ac3f3d63d1243ff84a8142039f1d5890846e0601e9f8a883bc9677748af8cb Mar 20 13:29:16 crc kubenswrapper[4849]: E0320 13:29:16.691224 4849 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e8fbfb0564180 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:29:16.690522496 +0000 UTC m=+306.368245901,LastTimestamp:2026-03-20 13:29:16.690522496 +0000 UTC m=+306.368245901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:29:16 crc kubenswrapper[4849]: I0320 13:29:16.867928 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e0ac3f3d63d1243ff84a8142039f1d5890846e0601e9f8a883bc9677748af8cb"} Mar 20 13:29:17 crc kubenswrapper[4849]: E0320 13:29:17.056250 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Mar 20 13:29:17 crc kubenswrapper[4849]: I0320 13:29:17.875115 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d"} Mar 20 13:29:17 crc kubenswrapper[4849]: E0320 13:29:17.875730 4849 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:17 crc kubenswrapper[4849]: I0320 13:29:17.875778 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:17 crc kubenswrapper[4849]: I0320 13:29:17.876109 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:18 crc kubenswrapper[4849]: E0320 13:29:18.657663 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="3.2s" Mar 20 13:29:18 crc kubenswrapper[4849]: E0320 13:29:18.881239 4849 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:18 crc kubenswrapper[4849]: I0320 13:29:18.900542 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:29:18 crc kubenswrapper[4849]: I0320 13:29:18.902072 4849 status_manager.go:851] "Failed to get status for pod" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" pod="openshift-marketplace/redhat-operators-lnk65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lnk65\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:18 crc kubenswrapper[4849]: I0320 13:29:18.902628 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:18 crc kubenswrapper[4849]: I0320 13:29:18.903383 4849 status_manager.go:851] "Failed to get status for pod" 
podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:18 crc kubenswrapper[4849]: I0320 13:29:18.938601 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:29:18 crc kubenswrapper[4849]: I0320 13:29:18.939406 4849 status_manager.go:851] "Failed to get status for pod" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" pod="openshift-marketplace/redhat-operators-lnk65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lnk65\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:18 crc kubenswrapper[4849]: I0320 13:29:18.940168 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:18 crc kubenswrapper[4849]: I0320 13:29:18.940746 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.313986 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.315067 4849 status_manager.go:851] "Failed to get status for pod" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" 
pod="openshift-marketplace/redhat-operators-lnk65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lnk65\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.315426 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.315696 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.316052 4849 status_manager.go:851] "Failed to get status for pod" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" pod="openshift-marketplace/redhat-operators-v2fkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v2fkf\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.355043 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.355937 4849 status_manager.go:851] "Failed to get status for pod" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" pod="openshift-marketplace/redhat-operators-lnk65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lnk65\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:19 crc 
kubenswrapper[4849]: I0320 13:29:19.356333 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.356953 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:19 crc kubenswrapper[4849]: I0320 13:29:19.357407 4849 status_manager.go:851] "Failed to get status for pod" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" pod="openshift-marketplace/redhat-operators-v2fkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v2fkf\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:21 crc kubenswrapper[4849]: I0320 13:29:21.041585 4849 status_manager.go:851] "Failed to get status for pod" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" pod="openshift-marketplace/redhat-operators-lnk65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lnk65\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:21 crc kubenswrapper[4849]: I0320 13:29:21.042526 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:21 crc kubenswrapper[4849]: I0320 
13:29:21.042960 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:21 crc kubenswrapper[4849]: I0320 13:29:21.043122 4849 status_manager.go:851] "Failed to get status for pod" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" pod="openshift-marketplace/redhat-operators-v2fkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v2fkf\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:21 crc kubenswrapper[4849]: E0320 13:29:21.081510 4849 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" volumeName="registry-storage" Mar 20 13:29:21 crc kubenswrapper[4849]: E0320 13:29:21.858882 4849 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="6.4s" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.034846 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.036963 4849 status_manager.go:851] "Failed to get status for pod" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" pod="openshift-marketplace/redhat-operators-v2fkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v2fkf\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.037772 4849 status_manager.go:851] "Failed to get status for pod" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" pod="openshift-marketplace/redhat-operators-lnk65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lnk65\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.038173 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.038543 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.054151 4849 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.054190 4849 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:24 crc kubenswrapper[4849]: E0320 13:29:24.054621 4849 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.055210 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:24 crc kubenswrapper[4849]: W0320 13:29:24.081891 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-72d3f491be002a928da56b76b60fe25c8d8d45d04801378c27853373b2b83073 WatchSource:0}: Error finding container 72d3f491be002a928da56b76b60fe25c8d8d45d04801378c27853373b2b83073: Status 404 returned error can't find the container with id 72d3f491be002a928da56b76b60fe25c8d8d45d04801378c27853373b2b83073 Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.920442 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.921789 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.921897 4849 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a" exitCode=1 Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.922035 4849 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a"} Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.923069 4849 scope.go:117] "RemoveContainer" containerID="022f003349c9db5d8c3b128c7ba11188adad41ec017f008553beb51810a99f2a" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.923256 4849 status_manager.go:851] "Failed to get status for pod" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" pod="openshift-marketplace/redhat-operators-lnk65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lnk65\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.923814 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.924723 4849 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d3a03640e170f858dc77d0ed98dcc5817b488f2b25394bff9fd17f37f5312f54" exitCode=0 Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.924770 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d3a03640e170f858dc77d0ed98dcc5817b488f2b25394bff9fd17f37f5312f54"} Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.924763 4849 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.924878 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"72d3f491be002a928da56b76b60fe25c8d8d45d04801378c27853373b2b83073"} Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.925341 4849 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.925385 4849 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.925660 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: E0320 13:29:24.925918 4849 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.926532 4849 status_manager.go:851] "Failed to get status for pod" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" pod="openshift-marketplace/redhat-operators-v2fkf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v2fkf\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.927271 4849 status_manager.go:851] "Failed to get status for pod" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" pod="openshift-marketplace/redhat-operators-lnk65" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lnk65\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.928085 4849 status_manager.go:851] "Failed to get status for pod" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" pod="openshift-marketplace/redhat-marketplace-6zwxp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6zwxp\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.928529 4849 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.929072 4849 status_manager.go:851] "Failed to get status for pod" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:24 crc kubenswrapper[4849]: I0320 13:29:24.929778 4849 status_manager.go:851] "Failed to get status for pod" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" pod="openshift-marketplace/redhat-operators-v2fkf" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-v2fkf\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 20 13:29:25 crc kubenswrapper[4849]: I0320 13:29:25.939020 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"50c05f7a1b3042c610ab79dcd6f15faea377b60372298d6a341348e5662d577b"} Mar 20 13:29:25 crc kubenswrapper[4849]: I0320 13:29:25.939092 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f2864a38078562cda2291e8915ab559042058af76e09f5b0c3dd002dd53795d5"} Mar 20 13:29:25 crc kubenswrapper[4849]: I0320 13:29:25.939104 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"aaeaaf55ae7432c05ad5e42c982013ab373edac5f349c28234dc124659354b3d"} Mar 20 13:29:25 crc kubenswrapper[4849]: I0320 13:29:25.939114 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8359860bbd18d24c2fb8f355b9ebf11a76436e560aadebf658811dc05e1729a1"} Mar 20 13:29:25 crc kubenswrapper[4849]: I0320 13:29:25.945337 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:29:25 crc kubenswrapper[4849]: I0320 13:29:25.945928 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:29:25 crc kubenswrapper[4849]: I0320 13:29:25.945975 
4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6e5a849122f8fb58886497c8e23e52665ea00e09a226f1047c3dff1405f46cd2"} Mar 20 13:29:26 crc kubenswrapper[4849]: I0320 13:29:26.957879 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b06bee67c3cc178ff5cf2d81814fda30cdd5dac2fe3d33355623d200bfdcf758"} Mar 20 13:29:26 crc kubenswrapper[4849]: I0320 13:29:26.958579 4849 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:26 crc kubenswrapper[4849]: I0320 13:29:26.958596 4849 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:26 crc kubenswrapper[4849]: I0320 13:29:26.958800 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:29 crc kubenswrapper[4849]: I0320 13:29:29.055632 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:29 crc kubenswrapper[4849]: I0320 13:29:29.057205 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:29 crc kubenswrapper[4849]: I0320 13:29:29.063990 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:31 crc kubenswrapper[4849]: I0320 13:29:31.982434 4849 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:32 crc kubenswrapper[4849]: I0320 13:29:32.243498 4849 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:29:32 crc kubenswrapper[4849]: I0320 13:29:32.990890 4849 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:32 crc kubenswrapper[4849]: I0320 13:29:32.991439 4849 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:32 crc kubenswrapper[4849]: I0320 13:29:32.994325 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:32 crc kubenswrapper[4849]: I0320 13:29:32.997396 4849 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="25d0f9b1-844f-46ec-8f6c-195060f50ffa" Mar 20 13:29:33 crc kubenswrapper[4849]: I0320 13:29:33.218797 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:29:33 crc kubenswrapper[4849]: I0320 13:29:33.222101 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:29:33 crc kubenswrapper[4849]: I0320 13:29:33.998123 4849 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:33 crc kubenswrapper[4849]: I0320 13:29:33.998789 4849 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="55d45a10-c0f3-44bd-b133-ff8a72a02483" Mar 20 13:29:41 crc kubenswrapper[4849]: I0320 13:29:41.060542 4849 status_manager.go:861] "Pod was deleted and then recreated, skipping status 
update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="25d0f9b1-844f-46ec-8f6c-195060f50ffa" Mar 20 13:29:41 crc kubenswrapper[4849]: I0320 13:29:41.280188 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:29:42 crc kubenswrapper[4849]: I0320 13:29:42.207305 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 13:29:42 crc kubenswrapper[4849]: I0320 13:29:42.246712 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:29:42 crc kubenswrapper[4849]: I0320 13:29:42.249598 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:29:42 crc kubenswrapper[4849]: I0320 13:29:42.662608 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:29:42 crc kubenswrapper[4849]: I0320 13:29:42.668103 4849 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:29:42 crc kubenswrapper[4849]: I0320 13:29:42.969936 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:29:43 crc kubenswrapper[4849]: I0320 13:29:43.143425 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 13:29:43 crc kubenswrapper[4849]: I0320 13:29:43.164954 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 13:29:43 crc kubenswrapper[4849]: I0320 13:29:43.329834 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:29:43 crc kubenswrapper[4849]: I0320 13:29:43.493071 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 13:29:43 crc kubenswrapper[4849]: I0320 13:29:43.514399 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:29:43 crc kubenswrapper[4849]: I0320 13:29:43.580056 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:29:43 crc kubenswrapper[4849]: I0320 13:29:43.774708 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.008048 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.031614 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.182623 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.251389 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.317650 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.546176 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:29:44 crc 
kubenswrapper[4849]: I0320 13:29:44.565322 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.675157 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.897086 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.898026 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:29:44 crc kubenswrapper[4849]: I0320 13:29:44.979738 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 13:29:45 crc kubenswrapper[4849]: I0320 13:29:45.184330 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:29:45 crc kubenswrapper[4849]: I0320 13:29:45.274753 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 13:29:45 crc kubenswrapper[4849]: I0320 13:29:45.469252 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 13:29:45 crc kubenswrapper[4849]: I0320 13:29:45.471458 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 13:29:45 crc kubenswrapper[4849]: I0320 13:29:45.617332 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 13:29:45 crc kubenswrapper[4849]: I0320 13:29:45.656790 4849 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:29:45 crc kubenswrapper[4849]: I0320 13:29:45.740524 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:29:45 crc kubenswrapper[4849]: I0320 13:29:45.834170 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.023302 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.219424 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.235762 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.491600 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.601316 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.612697 4849 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.640413 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.678085 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 
13:29:46.700496 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.722722 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.731833 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.759785 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.784665 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.896086 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:29:46 crc kubenswrapper[4849]: I0320 13:29:46.994578 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.062229 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.171609 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.323960 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.389518 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 
20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.390184 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.435061 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.448489 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.463729 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.468472 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.498070 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.510864 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.517636 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.586514 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.789905 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.841261 4849 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.930679 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.935653 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 13:29:47 crc kubenswrapper[4849]: I0320 13:29:47.959450 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.004767 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.007997 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.028600 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.093261 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.141077 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.189490 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.265107 4849 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:29:48 crc 
kubenswrapper[4849]: I0320 13:29:48.312789 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.328059 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.405249 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.528945 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.529231 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.635160 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.661200 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.696584 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.805050 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.806040 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.809935 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:29:48 crc kubenswrapper[4849]: I0320 13:29:48.866973 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.012073 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.027149 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.048263 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.095100 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.153067 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.196091 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.214889 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.277781 4849 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.278940 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.434867 4849 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.464216 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.509197 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.521674 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.542749 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.641266 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.664001 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.694350 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.719630 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.758309 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.819639 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:29:49 crc 
kubenswrapper[4849]: I0320 13:29:49.858659 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.915548 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.928536 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.953767 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.970538 4849 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.976015 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/redhat-marketplace-6zwxp"] Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.976097 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.980552 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:29:49 crc kubenswrapper[4849]: I0320 13:29:49.997411 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.997394255 podStartE2EDuration="18.997394255s" podCreationTimestamp="2026-03-20 13:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:29:49.991461761 +0000 UTC m=+339.669185176" watchObservedRunningTime="2026-03-20 13:29:49.997394255 +0000 UTC m=+339.675117650" 
Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.006500 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.051115 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.081444 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.084314 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.148493 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.161795 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.221405 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.239699 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.279693 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.280318 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.280974 
4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.298093 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.305511 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.575299 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.653807 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.706626 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.873909 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.892108 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.900835 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:29:50 crc kubenswrapper[4849]: I0320 13:29:50.911885 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.043526 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee7ffb06-f91c-4469-9c5d-ee0a4296c805" 
path="/var/lib/kubelet/pods/ee7ffb06-f91c-4469-9c5d-ee0a4296c805/volumes" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.083163 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.216093 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.283050 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.352354 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.454377 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.467666 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.482977 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.513763 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.514383 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.694208 4849 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.798431 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.827588 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.853140 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.939767 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 13:29:51 crc kubenswrapper[4849]: I0320 13:29:51.944469 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.007916 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.008047 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.200574 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.219738 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.240870 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.293155 4849 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.310552 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.327482 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.380682 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.401097 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.453478 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.459789 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.506892 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.543055 4849 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.661782 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.684324 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.698250 4849 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.716391 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.825965 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.827527 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:29:52 crc kubenswrapper[4849]: I0320 13:29:52.961939 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.155757 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.182887 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.211424 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.229081 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.294550 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.354312 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 
13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.377637 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.396357 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.461007 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.490892 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.494250 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.537647 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.559386 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.564716 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.891661 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.947467 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:29:53 crc kubenswrapper[4849]: I0320 13:29:53.964114 
4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.018430 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.052374 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.171051 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.183961 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.189054 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.224154 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.248172 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.267422 4849 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.267739 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d" gracePeriod=5 Mar 20 13:29:54 crc 
kubenswrapper[4849]: I0320 13:29:54.291162 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.305246 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.373222 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.448540 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.544356 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.603780 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.639908 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.689372 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.710033 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.739846 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.755881 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.898046 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.939486 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:29:54 crc kubenswrapper[4849]: I0320 13:29:54.974868 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.007887 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.016238 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.078605 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.156545 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.213809 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.216731 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.270752 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.309282 4849 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.368559 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.430433 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.440793 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.458347 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.499252 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.509501 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.683949 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.684410 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.703858 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.782787 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 13:29:55 
crc kubenswrapper[4849]: I0320 13:29:55.803280 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.839134 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 13:29:55 crc kubenswrapper[4849]: I0320 13:29:55.937289 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:29:56 crc kubenswrapper[4849]: I0320 13:29:56.012465 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 13:29:56 crc kubenswrapper[4849]: I0320 13:29:56.280389 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 13:29:56 crc kubenswrapper[4849]: I0320 13:29:56.388172 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:29:56 crc kubenswrapper[4849]: I0320 13:29:56.460453 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:29:56 crc kubenswrapper[4849]: I0320 13:29:56.546457 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:29:56 crc kubenswrapper[4849]: I0320 13:29:56.556984 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:29:56 crc kubenswrapper[4849]: I0320 13:29:56.886235 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:29:56 crc kubenswrapper[4849]: I0320 13:29:56.927378 4849 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 13:29:57 crc kubenswrapper[4849]: I0320 13:29:57.227750 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 13:29:57 crc kubenswrapper[4849]: I0320 13:29:57.321391 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 13:29:57 crc kubenswrapper[4849]: I0320 13:29:57.354493 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:29:57 crc kubenswrapper[4849]: I0320 13:29:57.358015 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 13:29:57 crc kubenswrapper[4849]: I0320 13:29:57.579966 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:29:57 crc kubenswrapper[4849]: I0320 13:29:57.886147 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.084166 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.104981 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.125550 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.180165 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.355799 4849 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.441036 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.536878 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.693151 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.695977 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.710292 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 13:29:58 crc kubenswrapper[4849]: I0320 13:29:58.748279 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.107369 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.216594 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.767709 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.854723 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.854845 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964069 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964088 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964239 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964296 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964339 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964472 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964505 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964481 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964602 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.964985 4849 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.965366 4849 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.965384 4849 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.965399 4849 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 13:29:59 crc kubenswrapper[4849]: I0320 13:29:59.974467 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.067341 4849 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.160698 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.160746 4849 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d" exitCode=137 Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.160792 4849 scope.go:117] "RemoveContainer" containerID="bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.160832 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.165211 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566890-zrc5b"] Mar 20 13:30:00 crc kubenswrapper[4849]: E0320 13:30:00.165482 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" containerName="installer" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.165502 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" containerName="installer" Mar 20 13:30:00 crc kubenswrapper[4849]: E0320 13:30:00.165516 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.165525 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.165653 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.165674 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5ebd6cd-36b8-4c55-a33f-442885c800c3" containerName="installer" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.166161 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.169019 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.169965 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.170404 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.173909 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld"] Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.174697 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.176373 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.176711 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.182578 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-zrc5b"] Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.187847 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld"] Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.188326 4849 scope.go:117] "RemoveContainer" containerID="bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d" Mar 20 13:30:00 crc 
kubenswrapper[4849]: E0320 13:30:00.188865 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d\": container with ID starting with bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d not found: ID does not exist" containerID="bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.188920 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d"} err="failed to get container status \"bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d\": rpc error: code = NotFound desc = could not find container \"bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d\": container with ID starting with bc8c8c7b8792c1a4e0787e8265fce6b0f9ba8a22a5cc296a6a4519130d3d679d not found: ID does not exist" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.270260 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-secret-volume\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.270320 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-config-volume\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.270418 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzqch\" (UniqueName: \"kubernetes.io/projected/d5ad70d6-39b4-4488-99a2-34b33c249a5a-kube-api-access-pzqch\") pod \"auto-csr-approver-29566890-zrc5b\" (UID: \"d5ad70d6-39b4-4488-99a2-34b33c249a5a\") " pod="openshift-infra/auto-csr-approver-29566890-zrc5b" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.270446 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xxjx\" (UniqueName: \"kubernetes.io/projected/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-kube-api-access-7xxjx\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.371240 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-secret-volume\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.371299 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-config-volume\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.371354 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzqch\" (UniqueName: \"kubernetes.io/projected/d5ad70d6-39b4-4488-99a2-34b33c249a5a-kube-api-access-pzqch\") pod \"auto-csr-approver-29566890-zrc5b\" (UID: 
\"d5ad70d6-39b4-4488-99a2-34b33c249a5a\") " pod="openshift-infra/auto-csr-approver-29566890-zrc5b" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.371374 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xxjx\" (UniqueName: \"kubernetes.io/projected/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-kube-api-access-7xxjx\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.372446 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-config-volume\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.380205 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-secret-volume\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.388329 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzqch\" (UniqueName: \"kubernetes.io/projected/d5ad70d6-39b4-4488-99a2-34b33c249a5a-kube-api-access-pzqch\") pod \"auto-csr-approver-29566890-zrc5b\" (UID: \"d5ad70d6-39b4-4488-99a2-34b33c249a5a\") " pod="openshift-infra/auto-csr-approver-29566890-zrc5b" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.390804 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xxjx\" (UniqueName: 
\"kubernetes.io/projected/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-kube-api-access-7xxjx\") pod \"collect-profiles-29566890-mhpld\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.498936 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.508096 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.891842 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-zrc5b"] Mar 20 13:30:00 crc kubenswrapper[4849]: I0320 13:30:00.941925 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld"] Mar 20 13:30:00 crc kubenswrapper[4849]: W0320 13:30:00.948163 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c18dd6_0aa5_4ead_84eb_830f2c08c2f6.slice/crio-bffa9a40aa8113d627f683c345fb4a52574dbd381cbc453d77ba377c01d32e00 WatchSource:0}: Error finding container bffa9a40aa8113d627f683c345fb4a52574dbd381cbc453d77ba377c01d32e00: Status 404 returned error can't find the container with id bffa9a40aa8113d627f683c345fb4a52574dbd381cbc453d77ba377c01d32e00 Mar 20 13:30:01 crc kubenswrapper[4849]: I0320 13:30:01.042752 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 13:30:01 crc kubenswrapper[4849]: I0320 13:30:01.167572 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" 
event={"ID":"d5ad70d6-39b4-4488-99a2-34b33c249a5a","Type":"ContainerStarted","Data":"f850681d9be5fac0713f8963d5cf1efb4fd9a97ce3b9fe5e22c341b4a580ac41"} Mar 20 13:30:01 crc kubenswrapper[4849]: I0320 13:30:01.169859 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" event={"ID":"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6","Type":"ContainerStarted","Data":"ae409c2ab13fef38f3668e15a6ec99013b871357072dbb7bdfb9a3e0aa7b72af"} Mar 20 13:30:01 crc kubenswrapper[4849]: I0320 13:30:01.169918 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" event={"ID":"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6","Type":"ContainerStarted","Data":"bffa9a40aa8113d627f683c345fb4a52574dbd381cbc453d77ba377c01d32e00"} Mar 20 13:30:01 crc kubenswrapper[4849]: I0320 13:30:01.189325 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" podStartSLOduration=1.189305203 podStartE2EDuration="1.189305203s" podCreationTimestamp="2026-03-20 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:30:01.184405298 +0000 UTC m=+350.862128693" watchObservedRunningTime="2026-03-20 13:30:01.189305203 +0000 UTC m=+350.867028598" Mar 20 13:30:02 crc kubenswrapper[4849]: I0320 13:30:02.179683 4849 generic.go:334] "Generic (PLEG): container finished" podID="55c18dd6-0aa5-4ead-84eb-830f2c08c2f6" containerID="ae409c2ab13fef38f3668e15a6ec99013b871357072dbb7bdfb9a3e0aa7b72af" exitCode=0 Mar 20 13:30:02 crc kubenswrapper[4849]: I0320 13:30:02.179722 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" 
event={"ID":"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6","Type":"ContainerDied","Data":"ae409c2ab13fef38f3668e15a6ec99013b871357072dbb7bdfb9a3e0aa7b72af"} Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.186348 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" event={"ID":"d5ad70d6-39b4-4488-99a2-34b33c249a5a","Type":"ContainerStarted","Data":"6d77d5825b322fe27bfce4d8d87e56d10e65d9f2d6805122b94bc71d4bd8b64a"} Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.205785 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" podStartSLOduration=1.194104845 podStartE2EDuration="3.205768538s" podCreationTimestamp="2026-03-20 13:30:00 +0000 UTC" firstStartedPulling="2026-03-20 13:30:00.904019873 +0000 UTC m=+350.581743268" lastFinishedPulling="2026-03-20 13:30:02.915683556 +0000 UTC m=+352.593406961" observedRunningTime="2026-03-20 13:30:03.201958013 +0000 UTC m=+352.879681428" watchObservedRunningTime="2026-03-20 13:30:03.205768538 +0000 UTC m=+352.883491933" Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.451281 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.516593 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-secret-volume\") pod \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.516667 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xxjx\" (UniqueName: \"kubernetes.io/projected/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-kube-api-access-7xxjx\") pod \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.516711 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-config-volume\") pod \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\" (UID: \"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6\") " Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.517486 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-config-volume" (OuterVolumeSpecName: "config-volume") pod "55c18dd6-0aa5-4ead-84eb-830f2c08c2f6" (UID: "55c18dd6-0aa5-4ead-84eb-830f2c08c2f6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.522364 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-kube-api-access-7xxjx" (OuterVolumeSpecName: "kube-api-access-7xxjx") pod "55c18dd6-0aa5-4ead-84eb-830f2c08c2f6" (UID: "55c18dd6-0aa5-4ead-84eb-830f2c08c2f6"). 
InnerVolumeSpecName "kube-api-access-7xxjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.522475 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "55c18dd6-0aa5-4ead-84eb-830f2c08c2f6" (UID: "55c18dd6-0aa5-4ead-84eb-830f2c08c2f6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.618110 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xxjx\" (UniqueName: \"kubernetes.io/projected/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-kube-api-access-7xxjx\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.618155 4849 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:03 crc kubenswrapper[4849]: I0320 13:30:03.618168 4849 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/55c18dd6-0aa5-4ead-84eb-830f2c08c2f6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:04 crc kubenswrapper[4849]: I0320 13:30:04.192900 4849 generic.go:334] "Generic (PLEG): container finished" podID="d5ad70d6-39b4-4488-99a2-34b33c249a5a" containerID="6d77d5825b322fe27bfce4d8d87e56d10e65d9f2d6805122b94bc71d4bd8b64a" exitCode=0 Mar 20 13:30:04 crc kubenswrapper[4849]: I0320 13:30:04.192989 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" event={"ID":"d5ad70d6-39b4-4488-99a2-34b33c249a5a","Type":"ContainerDied","Data":"6d77d5825b322fe27bfce4d8d87e56d10e65d9f2d6805122b94bc71d4bd8b64a"} Mar 20 13:30:04 crc kubenswrapper[4849]: I0320 13:30:04.194149 4849 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" event={"ID":"55c18dd6-0aa5-4ead-84eb-830f2c08c2f6","Type":"ContainerDied","Data":"bffa9a40aa8113d627f683c345fb4a52574dbd381cbc453d77ba377c01d32e00"} Mar 20 13:30:04 crc kubenswrapper[4849]: I0320 13:30:04.194177 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bffa9a40aa8113d627f683c345fb4a52574dbd381cbc453d77ba377c01d32e00" Mar 20 13:30:04 crc kubenswrapper[4849]: I0320 13:30:04.194219 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-mhpld" Mar 20 13:30:05 crc kubenswrapper[4849]: I0320 13:30:05.451158 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" Mar 20 13:30:05 crc kubenswrapper[4849]: I0320 13:30:05.547275 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzqch\" (UniqueName: \"kubernetes.io/projected/d5ad70d6-39b4-4488-99a2-34b33c249a5a-kube-api-access-pzqch\") pod \"d5ad70d6-39b4-4488-99a2-34b33c249a5a\" (UID: \"d5ad70d6-39b4-4488-99a2-34b33c249a5a\") " Mar 20 13:30:05 crc kubenswrapper[4849]: I0320 13:30:05.552002 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ad70d6-39b4-4488-99a2-34b33c249a5a-kube-api-access-pzqch" (OuterVolumeSpecName: "kube-api-access-pzqch") pod "d5ad70d6-39b4-4488-99a2-34b33c249a5a" (UID: "d5ad70d6-39b4-4488-99a2-34b33c249a5a"). InnerVolumeSpecName "kube-api-access-pzqch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:05 crc kubenswrapper[4849]: I0320 13:30:05.649504 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzqch\" (UniqueName: \"kubernetes.io/projected/d5ad70d6-39b4-4488-99a2-34b33c249a5a-kube-api-access-pzqch\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:06 crc kubenswrapper[4849]: I0320 13:30:06.206903 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" event={"ID":"d5ad70d6-39b4-4488-99a2-34b33c249a5a","Type":"ContainerDied","Data":"f850681d9be5fac0713f8963d5cf1efb4fd9a97ce3b9fe5e22c341b4a580ac41"} Mar 20 13:30:06 crc kubenswrapper[4849]: I0320 13:30:06.206938 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-zrc5b" Mar 20 13:30:06 crc kubenswrapper[4849]: I0320 13:30:06.206950 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f850681d9be5fac0713f8963d5cf1efb4fd9a97ce3b9fe5e22c341b4a580ac41" Mar 20 13:30:15 crc kubenswrapper[4849]: I0320 13:30:15.260179 4849 generic.go:334] "Generic (PLEG): container finished" podID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerID="994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81" exitCode=0 Mar 20 13:30:15 crc kubenswrapper[4849]: I0320 13:30:15.260296 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" event={"ID":"b606bf18-c941-4fe2-9edf-8e4bf69bdc68","Type":"ContainerDied","Data":"994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81"} Mar 20 13:30:15 crc kubenswrapper[4849]: I0320 13:30:15.261678 4849 scope.go:117] "RemoveContainer" containerID="994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81" Mar 20 13:30:16 crc kubenswrapper[4849]: I0320 13:30:16.270714 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" event={"ID":"b606bf18-c941-4fe2-9edf-8e4bf69bdc68","Type":"ContainerStarted","Data":"df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a"} Mar 20 13:30:16 crc kubenswrapper[4849]: I0320 13:30:16.271476 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:30:16 crc kubenswrapper[4849]: I0320 13:30:16.274067 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:31:09 crc kubenswrapper[4849]: I0320 13:31:09.385000 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:31:09 crc kubenswrapper[4849]: I0320 13:31:09.385796 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.335297 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8gdkr"] Mar 20 13:31:19 crc kubenswrapper[4849]: E0320 13:31:19.336714 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ad70d6-39b4-4488-99a2-34b33c249a5a" containerName="oc" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.336736 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ad70d6-39b4-4488-99a2-34b33c249a5a" containerName="oc" Mar 20 13:31:19 crc kubenswrapper[4849]: E0320 13:31:19.336754 4849 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c18dd6-0aa5-4ead-84eb-830f2c08c2f6" containerName="collect-profiles" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.336762 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c18dd6-0aa5-4ead-84eb-830f2c08c2f6" containerName="collect-profiles" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.336937 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c18dd6-0aa5-4ead-84eb-830f2c08c2f6" containerName="collect-profiles" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.336959 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ad70d6-39b4-4488-99a2-34b33c249a5a" containerName="oc" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.337795 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.357415 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8gdkr"] Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.498776 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49931f98-9bee-4f00-bae1-8d060e86fc95-registry-certificates\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.498901 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-registry-tls\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc 
kubenswrapper[4849]: I0320 13:31:19.498939 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-bound-sa-token\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.498980 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49931f98-9bee-4f00-bae1-8d060e86fc95-trusted-ca\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.499011 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.499040 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qxg6\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-kube-api-access-7qxg6\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.499058 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/49931f98-9bee-4f00-bae1-8d060e86fc95-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.499075 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49931f98-9bee-4f00-bae1-8d060e86fc95-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.524294 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.600462 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qxg6\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-kube-api-access-7qxg6\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.600544 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49931f98-9bee-4f00-bae1-8d060e86fc95-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc 
kubenswrapper[4849]: I0320 13:31:19.600572 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49931f98-9bee-4f00-bae1-8d060e86fc95-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.600605 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49931f98-9bee-4f00-bae1-8d060e86fc95-registry-certificates\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.600658 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-registry-tls\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.600692 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-bound-sa-token\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.600721 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49931f98-9bee-4f00-bae1-8d060e86fc95-trusted-ca\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.601596 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49931f98-9bee-4f00-bae1-8d060e86fc95-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.602465 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49931f98-9bee-4f00-bae1-8d060e86fc95-registry-certificates\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.602520 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49931f98-9bee-4f00-bae1-8d060e86fc95-trusted-ca\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.608982 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-registry-tls\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.608996 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49931f98-9bee-4f00-bae1-8d060e86fc95-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: 
\"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.617719 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qxg6\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-kube-api-access-7qxg6\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.622118 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49931f98-9bee-4f00-bae1-8d060e86fc95-bound-sa-token\") pod \"image-registry-66df7c8f76-8gdkr\" (UID: \"49931f98-9bee-4f00-bae1-8d060e86fc95\") " pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:19 crc kubenswrapper[4849]: I0320 13:31:19.655459 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:20 crc kubenswrapper[4849]: I0320 13:31:20.109659 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8gdkr"] Mar 20 13:31:20 crc kubenswrapper[4849]: I0320 13:31:20.663831 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" event={"ID":"49931f98-9bee-4f00-bae1-8d060e86fc95","Type":"ContainerStarted","Data":"ac97d3c5217dac74c0058ca0569d1a0c4b159b800516905d2476efe851f27b02"} Mar 20 13:31:20 crc kubenswrapper[4849]: I0320 13:31:20.664214 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:20 crc kubenswrapper[4849]: I0320 13:31:20.664229 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" event={"ID":"49931f98-9bee-4f00-bae1-8d060e86fc95","Type":"ContainerStarted","Data":"bad6979025ef15af271e70f21218d02e5b78b083540444d318c39bfefdce9853"} Mar 20 13:31:20 crc kubenswrapper[4849]: I0320 13:31:20.686945 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" podStartSLOduration=1.686922939 podStartE2EDuration="1.686922939s" podCreationTimestamp="2026-03-20 13:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:31:20.682340171 +0000 UTC m=+430.360063576" watchObservedRunningTime="2026-03-20 13:31:20.686922939 +0000 UTC m=+430.364646334" Mar 20 13:31:27 crc kubenswrapper[4849]: I0320 13:31:27.882646 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v2fkf"] Mar 20 13:31:27 crc kubenswrapper[4849]: I0320 13:31:27.883664 4849 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-v2fkf" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="registry-server" containerID="cri-o://5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609" gracePeriod=2 Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.263204 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.342912 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-catalog-content\") pod \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.343016 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-utilities\") pod \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.343094 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ndjl\" (UniqueName: \"kubernetes.io/projected/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-kube-api-access-4ndjl\") pod \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\" (UID: \"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10\") " Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.343749 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-utilities" (OuterVolumeSpecName: "utilities") pod "ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" (UID: "ae23d7db-4e32-4c07-ae0a-19dd8ac82a10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.350338 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-kube-api-access-4ndjl" (OuterVolumeSpecName: "kube-api-access-4ndjl") pod "ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" (UID: "ae23d7db-4e32-4c07-ae0a-19dd8ac82a10"). InnerVolumeSpecName "kube-api-access-4ndjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.444630 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.444671 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ndjl\" (UniqueName: \"kubernetes.io/projected/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-kube-api-access-4ndjl\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.483136 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" (UID: "ae23d7db-4e32-4c07-ae0a-19dd8ac82a10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.546128 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.710865 4849 generic.go:334] "Generic (PLEG): container finished" podID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerID="5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609" exitCode=0 Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.710908 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2fkf" event={"ID":"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10","Type":"ContainerDied","Data":"5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609"} Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.710918 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2fkf" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.710937 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2fkf" event={"ID":"ae23d7db-4e32-4c07-ae0a-19dd8ac82a10","Type":"ContainerDied","Data":"cad2328bff9ce2cc589ad2c72705bc9dae22179fa1562cdbdb002233959d9ae3"} Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.710977 4849 scope.go:117] "RemoveContainer" containerID="5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.732074 4849 scope.go:117] "RemoveContainer" containerID="a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.758926 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v2fkf"] Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.762591 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v2fkf"] Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.764991 4849 scope.go:117] "RemoveContainer" containerID="8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.794442 4849 scope.go:117] "RemoveContainer" containerID="5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609" Mar 20 13:31:28 crc kubenswrapper[4849]: E0320 13:31:28.795035 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609\": container with ID starting with 5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609 not found: ID does not exist" containerID="5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.795141 4849 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609"} err="failed to get container status \"5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609\": rpc error: code = NotFound desc = could not find container \"5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609\": container with ID starting with 5f18c26b515991b30c9638655f1e5d192e2ffe8abc8c5d18d38f08526b088609 not found: ID does not exist" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.795229 4849 scope.go:117] "RemoveContainer" containerID="a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b" Mar 20 13:31:28 crc kubenswrapper[4849]: E0320 13:31:28.795569 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b\": container with ID starting with a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b not found: ID does not exist" containerID="a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.795593 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b"} err="failed to get container status \"a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b\": rpc error: code = NotFound desc = could not find container \"a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b\": container with ID starting with a7f674cde346247a5ef2a16c554d999c7cf3447c614859ef7b285bcc1813385b not found: ID does not exist" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.795608 4849 scope.go:117] "RemoveContainer" containerID="8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0" Mar 20 13:31:28 crc kubenswrapper[4849]: E0320 
13:31:28.795813 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0\": container with ID starting with 8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0 not found: ID does not exist" containerID="8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0" Mar 20 13:31:28 crc kubenswrapper[4849]: I0320 13:31:28.795924 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0"} err="failed to get container status \"8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0\": rpc error: code = NotFound desc = could not find container \"8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0\": container with ID starting with 8411086cad242858ebb106df2cf59bd1ccca029556c2d1c376e61e2514e034a0 not found: ID does not exist" Mar 20 13:31:29 crc kubenswrapper[4849]: I0320 13:31:29.042112 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" path="/var/lib/kubelet/pods/ae23d7db-4e32-4c07-ae0a-19dd8ac82a10/volumes" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.031296 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ft4dw"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.032196 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ft4dw" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerName="registry-server" containerID="cri-o://64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c" gracePeriod=30 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.042728 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xx4fv"] Mar 20 13:31:35 crc 
kubenswrapper[4849]: I0320 13:31:35.043247 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xx4fv" podUID="63553d28-5dba-492e-b004-043ea30ee635" containerName="registry-server" containerID="cri-o://613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4" gracePeriod=30 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.053920 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v8tw5"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.054142 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" containerID="cri-o://df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a" gracePeriod=30 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.080052 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgk97"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.080383 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dgk97" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerName="registry-server" containerID="cri-o://072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca" gracePeriod=30 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.081218 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fhpw4"] Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.081680 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="extract-utilities" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.081704 4849 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="extract-utilities" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.081733 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="registry-server" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.081769 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="registry-server" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.081788 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="extract-content" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.081800 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="extract-content" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.081975 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae23d7db-4e32-4c07-ae0a-19dd8ac82a10" containerName="registry-server" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.082549 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.084528 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fhpw4"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.087365 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lnk65"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.087606 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lnk65" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="registry-server" containerID="cri-o://de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f" gracePeriod=30 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.169518 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ead5a591-d201-4f88-8357-d2c8d3ceb93e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.169633 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ead5a591-d201-4f88-8357-d2c8d3ceb93e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.169712 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4llr\" (UniqueName: 
\"kubernetes.io/projected/ead5a591-d201-4f88-8357-d2c8d3ceb93e-kube-api-access-w4llr\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.271556 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4llr\" (UniqueName: \"kubernetes.io/projected/ead5a591-d201-4f88-8357-d2c8d3ceb93e-kube-api-access-w4llr\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.271638 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ead5a591-d201-4f88-8357-d2c8d3ceb93e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.271694 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ead5a591-d201-4f88-8357-d2c8d3ceb93e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.296034 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ead5a591-d201-4f88-8357-d2c8d3ceb93e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 
13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.296189 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4llr\" (UniqueName: \"kubernetes.io/projected/ead5a591-d201-4f88-8357-d2c8d3ceb93e-kube-api-access-w4llr\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.297264 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ead5a591-d201-4f88-8357-d2c8d3ceb93e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fhpw4\" (UID: \"ead5a591-d201-4f88-8357-d2c8d3ceb93e\") " pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.536580 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.540392 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.553387 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.558712 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.571509 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575486 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-catalog-content\") pod \"63553d28-5dba-492e-b004-043ea30ee635\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575562 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-utilities\") pod \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575614 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdlp7\" (UniqueName: \"kubernetes.io/projected/63553d28-5dba-492e-b004-043ea30ee635-kube-api-access-qdlp7\") pod \"63553d28-5dba-492e-b004-043ea30ee635\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575654 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkxh8\" (UniqueName: \"kubernetes.io/projected/b7396166-d1a2-4565-8ccc-3ed06ce215f4-kube-api-access-hkxh8\") pod \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575688 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-utilities\") pod \"63553d28-5dba-492e-b004-043ea30ee635\" (UID: \"63553d28-5dba-492e-b004-043ea30ee635\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575712 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-catalog-content\") pod \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\" (UID: \"b7396166-d1a2-4565-8ccc-3ed06ce215f4\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575737 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-utilities\") pod \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575797 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-catalog-content\") pod \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.575878 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vvpp\" (UniqueName: \"kubernetes.io/projected/b7e8bcae-39ef-4786-b2b8-18dea74380fa-kube-api-access-9vvpp\") pod \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\" (UID: \"b7e8bcae-39ef-4786-b2b8-18dea74380fa\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.581227 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-utilities" (OuterVolumeSpecName: "utilities") pod "63553d28-5dba-492e-b004-043ea30ee635" (UID: "63553d28-5dba-492e-b004-043ea30ee635"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.584282 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-utilities" (OuterVolumeSpecName: "utilities") pod "b7e8bcae-39ef-4786-b2b8-18dea74380fa" (UID: "b7e8bcae-39ef-4786-b2b8-18dea74380fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.584902 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7396166-d1a2-4565-8ccc-3ed06ce215f4-kube-api-access-hkxh8" (OuterVolumeSpecName: "kube-api-access-hkxh8") pod "b7396166-d1a2-4565-8ccc-3ed06ce215f4" (UID: "b7396166-d1a2-4565-8ccc-3ed06ce215f4"). InnerVolumeSpecName "kube-api-access-hkxh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.591196 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e8bcae-39ef-4786-b2b8-18dea74380fa-kube-api-access-9vvpp" (OuterVolumeSpecName: "kube-api-access-9vvpp") pod "b7e8bcae-39ef-4786-b2b8-18dea74380fa" (UID: "b7e8bcae-39ef-4786-b2b8-18dea74380fa"). InnerVolumeSpecName "kube-api-access-9vvpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.592997 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-utilities" (OuterVolumeSpecName: "utilities") pod "b7396166-d1a2-4565-8ccc-3ed06ce215f4" (UID: "b7396166-d1a2-4565-8ccc-3ed06ce215f4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.594230 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63553d28-5dba-492e-b004-043ea30ee635-kube-api-access-qdlp7" (OuterVolumeSpecName: "kube-api-access-qdlp7") pod "63553d28-5dba-492e-b004-043ea30ee635" (UID: "63553d28-5dba-492e-b004-043ea30ee635"). InnerVolumeSpecName "kube-api-access-qdlp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.601591 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vvpp\" (UniqueName: \"kubernetes.io/projected/b7e8bcae-39ef-4786-b2b8-18dea74380fa-kube-api-access-9vvpp\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.601678 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.601698 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdlp7\" (UniqueName: \"kubernetes.io/projected/63553d28-5dba-492e-b004-043ea30ee635-kube-api-access-qdlp7\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.601711 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkxh8\" (UniqueName: \"kubernetes.io/projected/b7396166-d1a2-4565-8ccc-3ed06ce215f4-kube-api-access-hkxh8\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.601725 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.601739 4849 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.628886 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7e8bcae-39ef-4786-b2b8-18dea74380fa" (UID: "b7e8bcae-39ef-4786-b2b8-18dea74380fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.629591 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.672233 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7396166-d1a2-4565-8ccc-3ed06ce215f4" (UID: "b7396166-d1a2-4565-8ccc-3ed06ce215f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.687738 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63553d28-5dba-492e-b004-043ea30ee635" (UID: "63553d28-5dba-492e-b004-043ea30ee635"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702303 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-catalog-content\") pod \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702366 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-utilities\") pod \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702408 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-trusted-ca\") pod \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702483 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-operator-metrics\") pod \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702519 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6nc9\" (UniqueName: \"kubernetes.io/projected/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-kube-api-access-j6nc9\") pod \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\" (UID: \"b606bf18-c941-4fe2-9edf-8e4bf69bdc68\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702540 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-xm282\" (UniqueName: \"kubernetes.io/projected/02c87e15-4f0c-422f-812b-5a4bcbf1b639-kube-api-access-xm282\") pod \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\" (UID: \"02c87e15-4f0c-422f-812b-5a4bcbf1b639\") " Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702796 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63553d28-5dba-492e-b004-043ea30ee635-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702828 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7396166-d1a2-4565-8ccc-3ed06ce215f4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.702838 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7e8bcae-39ef-4786-b2b8-18dea74380fa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.705411 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c87e15-4f0c-422f-812b-5a4bcbf1b639-kube-api-access-xm282" (OuterVolumeSpecName: "kube-api-access-xm282") pod "02c87e15-4f0c-422f-812b-5a4bcbf1b639" (UID: "02c87e15-4f0c-422f-812b-5a4bcbf1b639"). InnerVolumeSpecName "kube-api-access-xm282". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.705748 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b606bf18-c941-4fe2-9edf-8e4bf69bdc68" (UID: "b606bf18-c941-4fe2-9edf-8e4bf69bdc68"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.706257 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-utilities" (OuterVolumeSpecName: "utilities") pod "02c87e15-4f0c-422f-812b-5a4bcbf1b639" (UID: "02c87e15-4f0c-422f-812b-5a4bcbf1b639"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.708924 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b606bf18-c941-4fe2-9edf-8e4bf69bdc68" (UID: "b606bf18-c941-4fe2-9edf-8e4bf69bdc68"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.708982 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-kube-api-access-j6nc9" (OuterVolumeSpecName: "kube-api-access-j6nc9") pod "b606bf18-c941-4fe2-9edf-8e4bf69bdc68" (UID: "b606bf18-c941-4fe2-9edf-8e4bf69bdc68"). InnerVolumeSpecName "kube-api-access-j6nc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.756419 4849 generic.go:334] "Generic (PLEG): container finished" podID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerID="072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca" exitCode=0 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.756508 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgk97" event={"ID":"b7e8bcae-39ef-4786-b2b8-18dea74380fa","Type":"ContainerDied","Data":"072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.756542 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dgk97" event={"ID":"b7e8bcae-39ef-4786-b2b8-18dea74380fa","Type":"ContainerDied","Data":"6dbb96491077bcbb88392d125176c9ccbd562b7dbc3bff56a02f54a474c87d93"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.756538 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dgk97" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.756573 4849 scope.go:117] "RemoveContainer" containerID="072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.761780 4849 generic.go:334] "Generic (PLEG): container finished" podID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerID="de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f" exitCode=0 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.761936 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lnk65" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.762127 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnk65" event={"ID":"02c87e15-4f0c-422f-812b-5a4bcbf1b639","Type":"ContainerDied","Data":"de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.762161 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lnk65" event={"ID":"02c87e15-4f0c-422f-812b-5a4bcbf1b639","Type":"ContainerDied","Data":"97d757430a5f795d6d8571640532081ba58e5efb58d48dd62d9343e6e390f7d3"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.765750 4849 generic.go:334] "Generic (PLEG): container finished" podID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerID="df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a" exitCode=0 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.765812 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" event={"ID":"b606bf18-c941-4fe2-9edf-8e4bf69bdc68","Type":"ContainerDied","Data":"df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.765856 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" event={"ID":"b606bf18-c941-4fe2-9edf-8e4bf69bdc68","Type":"ContainerDied","Data":"efaa028cac3913fed2dbf5c65cafa88c82304b4dfb2d9aded181af8972900d4e"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.765937 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-v8tw5" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.767793 4849 generic.go:334] "Generic (PLEG): container finished" podID="63553d28-5dba-492e-b004-043ea30ee635" containerID="613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4" exitCode=0 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.767886 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx4fv" event={"ID":"63553d28-5dba-492e-b004-043ea30ee635","Type":"ContainerDied","Data":"613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.767917 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xx4fv" event={"ID":"63553d28-5dba-492e-b004-043ea30ee635","Type":"ContainerDied","Data":"3d5483b02561a2783925fa8b6014a8301fe5c0d2f7c8ea86768f263f8c85113f"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.767983 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xx4fv" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.780784 4849 generic.go:334] "Generic (PLEG): container finished" podID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerID="64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c" exitCode=0 Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.780833 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft4dw" event={"ID":"b7396166-d1a2-4565-8ccc-3ed06ce215f4","Type":"ContainerDied","Data":"64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.780858 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ft4dw" event={"ID":"b7396166-d1a2-4565-8ccc-3ed06ce215f4","Type":"ContainerDied","Data":"e77fc277cfd42e3e90d30e1a55f30ddbc19bd108f9441c54a6ab5aff35b60d9a"} Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.780934 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ft4dw" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.781601 4849 scope.go:117] "RemoveContainer" containerID="10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.804744 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6nc9\" (UniqueName: \"kubernetes.io/projected/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-kube-api-access-j6nc9\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.804774 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm282\" (UniqueName: \"kubernetes.io/projected/02c87e15-4f0c-422f-812b-5a4bcbf1b639-kube-api-access-xm282\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.804785 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.804796 4849 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.804805 4849 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b606bf18-c941-4fe2-9edf-8e4bf69bdc68-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.809643 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v8tw5"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.811290 4849 scope.go:117] "RemoveContainer" 
containerID="2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.813596 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-v8tw5"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.821919 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ft4dw"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.846467 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ft4dw"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.857066 4849 scope.go:117] "RemoveContainer" containerID="072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.859294 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca\": container with ID starting with 072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca not found: ID does not exist" containerID="072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.859355 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca"} err="failed to get container status \"072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca\": rpc error: code = NotFound desc = could not find container \"072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca\": container with ID starting with 072c457c3893c1a8e7ba566e0db34e87c8a4e875c5ac36a5cf6a31773a1751ca not found: ID does not exist" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.859393 4849 scope.go:117] "RemoveContainer" 
containerID="10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.860005 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b\": container with ID starting with 10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b not found: ID does not exist" containerID="10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.860044 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b"} err="failed to get container status \"10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b\": rpc error: code = NotFound desc = could not find container \"10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b\": container with ID starting with 10e6a2706aaf48f19def9b4ad10abbb5a921eecd1d5e4f0aa33bb53a838eda1b not found: ID does not exist" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.860072 4849 scope.go:117] "RemoveContainer" containerID="2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.861081 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xx4fv"] Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.863735 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f\": container with ID starting with 2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f not found: ID does not exist" containerID="2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f" Mar 20 13:31:35 crc kubenswrapper[4849]: 
I0320 13:31:35.863767 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f"} err="failed to get container status \"2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f\": rpc error: code = NotFound desc = could not find container \"2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f\": container with ID starting with 2499f45dc61b8ff129c4a2c83a93d192f9c4f9fe3e574628766b3c99dea9524f not found: ID does not exist" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.863788 4849 scope.go:117] "RemoveContainer" containerID="de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.869593 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xx4fv"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.873539 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgk97"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.874420 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02c87e15-4f0c-422f-812b-5a4bcbf1b639" (UID: "02c87e15-4f0c-422f-812b-5a4bcbf1b639"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.877034 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dgk97"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.886371 4849 scope.go:117] "RemoveContainer" containerID="59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.906095 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c87e15-4f0c-422f-812b-5a4bcbf1b639-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.906868 4849 scope.go:117] "RemoveContainer" containerID="805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.925216 4849 scope.go:117] "RemoveContainer" containerID="de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.925792 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f\": container with ID starting with de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f not found: ID does not exist" containerID="de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.925877 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f"} err="failed to get container status \"de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f\": rpc error: code = NotFound desc = could not find container \"de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f\": container with ID starting with 
de9bc27fb4e665c65f36111b8a9ab917d8d826373ec1b046fd1a8b989d5ee61f not found: ID does not exist" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.925921 4849 scope.go:117] "RemoveContainer" containerID="59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.926670 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8\": container with ID starting with 59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8 not found: ID does not exist" containerID="59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.926900 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8"} err="failed to get container status \"59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8\": rpc error: code = NotFound desc = could not find container \"59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8\": container with ID starting with 59ff89546bae8a0b03e28f08c28cb0149bbda2c7d74ce38b28a2dc0b212808d8 not found: ID does not exist" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.927030 4849 scope.go:117] "RemoveContainer" containerID="805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.927647 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1\": container with ID starting with 805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1 not found: ID does not exist" containerID="805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1" Mar 20 13:31:35 crc 
kubenswrapper[4849]: I0320 13:31:35.927689 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1"} err="failed to get container status \"805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1\": rpc error: code = NotFound desc = could not find container \"805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1\": container with ID starting with 805d99b5f1aad788f94caea548df724fffe4aa4b9075f6c05f15faa7418072d1 not found: ID does not exist" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.927717 4849 scope.go:117] "RemoveContainer" containerID="df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.944342 4849 scope.go:117] "RemoveContainer" containerID="994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.966502 4849 scope.go:117] "RemoveContainer" containerID="df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.971055 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a\": container with ID starting with df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a not found: ID does not exist" containerID="df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.971110 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a"} err="failed to get container status \"df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a\": rpc error: code = NotFound desc = could not find container 
\"df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a\": container with ID starting with df4a55304002c95a7eac6025f7f381089458da72656ff73183762b7f4049180a not found: ID does not exist" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.971154 4849 scope.go:117] "RemoveContainer" containerID="994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81" Mar 20 13:31:35 crc kubenswrapper[4849]: E0320 13:31:35.971557 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81\": container with ID starting with 994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81 not found: ID does not exist" containerID="994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.971582 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81"} err="failed to get container status \"994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81\": rpc error: code = NotFound desc = could not find container \"994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81\": container with ID starting with 994542192d608c68cf1a0e5de227880d3aa0bd3ebb90c38163615a538b51fe81 not found: ID does not exist" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.971596 4849 scope.go:117] "RemoveContainer" containerID="613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4" Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.987490 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fhpw4"] Mar 20 13:31:35 crc kubenswrapper[4849]: I0320 13:31:35.990251 4849 scope.go:117] "RemoveContainer" containerID="0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab" Mar 20 13:31:35 crc 
kubenswrapper[4849]: W0320 13:31:35.995187 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead5a591_d201_4f88_8357_d2c8d3ceb93e.slice/crio-0182e0e81d84eccd186d19a91474f9178e26aabc9d4ac41a446bff48c6379dc4 WatchSource:0}: Error finding container 0182e0e81d84eccd186d19a91474f9178e26aabc9d4ac41a446bff48c6379dc4: Status 404 returned error can't find the container with id 0182e0e81d84eccd186d19a91474f9178e26aabc9d4ac41a446bff48c6379dc4 Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.010095 4849 scope.go:117] "RemoveContainer" containerID="02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.031119 4849 scope.go:117] "RemoveContainer" containerID="613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4" Mar 20 13:31:36 crc kubenswrapper[4849]: E0320 13:31:36.031677 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4\": container with ID starting with 613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4 not found: ID does not exist" containerID="613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.031720 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4"} err="failed to get container status \"613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4\": rpc error: code = NotFound desc = could not find container \"613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4\": container with ID starting with 613b521043d88adca05fdf9d6ebf1208efbfba43dda6c9d13a12efcb291191e4 not found: ID does not exist" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.031744 
4849 scope.go:117] "RemoveContainer" containerID="0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab" Mar 20 13:31:36 crc kubenswrapper[4849]: E0320 13:31:36.032455 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab\": container with ID starting with 0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab not found: ID does not exist" containerID="0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.032482 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab"} err="failed to get container status \"0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab\": rpc error: code = NotFound desc = could not find container \"0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab\": container with ID starting with 0c9728a580ab6e766797b6430ec200d4c8d0ea15b678ef34ea372e4ae963b2ab not found: ID does not exist" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.032504 4849 scope.go:117] "RemoveContainer" containerID="02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a" Mar 20 13:31:36 crc kubenswrapper[4849]: E0320 13:31:36.032810 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a\": container with ID starting with 02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a not found: ID does not exist" containerID="02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.032851 4849 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a"} err="failed to get container status \"02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a\": rpc error: code = NotFound desc = could not find container \"02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a\": container with ID starting with 02733cb4953f2222d4165721a4666e99506f00000704d20b0602c3f470a7348a not found: ID does not exist" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.032871 4849 scope.go:117] "RemoveContainer" containerID="64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.057829 4849 scope.go:117] "RemoveContainer" containerID="809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.078635 4849 scope.go:117] "RemoveContainer" containerID="b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.101981 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lnk65"] Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.104792 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lnk65"] Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.122349 4849 scope.go:117] "RemoveContainer" containerID="64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c" Mar 20 13:31:36 crc kubenswrapper[4849]: E0320 13:31:36.122714 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c\": container with ID starting with 64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c not found: ID does not exist" containerID="64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c" Mar 20 13:31:36 
crc kubenswrapper[4849]: I0320 13:31:36.122746 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c"} err="failed to get container status \"64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c\": rpc error: code = NotFound desc = could not find container \"64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c\": container with ID starting with 64925bc7d92d38f646d74c48018c1b2d5cc03c6edfb0e4cc2579995e9dda670c not found: ID does not exist" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.122771 4849 scope.go:117] "RemoveContainer" containerID="809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2" Mar 20 13:31:36 crc kubenswrapper[4849]: E0320 13:31:36.123071 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2\": container with ID starting with 809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2 not found: ID does not exist" containerID="809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.123092 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2"} err="failed to get container status \"809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2\": rpc error: code = NotFound desc = could not find container \"809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2\": container with ID starting with 809a4c0780e458b509b2a39b5446ee3cffacde4833634bbc565d6e32949d8df2 not found: ID does not exist" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.123105 4849 scope.go:117] "RemoveContainer" containerID="b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff" Mar 20 
13:31:36 crc kubenswrapper[4849]: E0320 13:31:36.123284 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff\": container with ID starting with b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff not found: ID does not exist" containerID="b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.123303 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff"} err="failed to get container status \"b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff\": rpc error: code = NotFound desc = could not find container \"b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff\": container with ID starting with b08618320e1ea3bb40bd3516cd7f8b5ae9c697e9366a76abd6bd9eea77b16aff not found: ID does not exist" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.791625 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" event={"ID":"ead5a591-d201-4f88-8357-d2c8d3ceb93e","Type":"ContainerStarted","Data":"7d1ffc0ce80f4c158da8d5ac1ea9969ab159688e84cac8031a2420d50092cece"} Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.791685 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" event={"ID":"ead5a591-d201-4f88-8357-d2c8d3ceb93e","Type":"ContainerStarted","Data":"0182e0e81d84eccd186d19a91474f9178e26aabc9d4ac41a446bff48c6379dc4"} Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.792026 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.796532 4849 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" Mar 20 13:31:36 crc kubenswrapper[4849]: I0320 13:31:36.810202 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fhpw4" podStartSLOduration=1.810181311 podStartE2EDuration="1.810181311s" podCreationTimestamp="2026-03-20 13:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:31:36.808646998 +0000 UTC m=+446.486370413" watchObservedRunningTime="2026-03-20 13:31:36.810181311 +0000 UTC m=+446.487904706" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.042786 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" path="/var/lib/kubelet/pods/02c87e15-4f0c-422f-812b-5a4bcbf1b639/volumes" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.043464 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63553d28-5dba-492e-b004-043ea30ee635" path="/var/lib/kubelet/pods/63553d28-5dba-492e-b004-043ea30ee635/volumes" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.044231 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" path="/var/lib/kubelet/pods/b606bf18-c941-4fe2-9edf-8e4bf69bdc68/volumes" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.045173 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" path="/var/lib/kubelet/pods/b7396166-d1a2-4565-8ccc-3ed06ce215f4/volumes" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.045722 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" path="/var/lib/kubelet/pods/b7e8bcae-39ef-4786-b2b8-18dea74380fa/volumes" Mar 20 13:31:37 crc 
kubenswrapper[4849]: I0320 13:31:37.242381 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zvvvr"] Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242582 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="extract-utilities" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242596 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="extract-utilities" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242606 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63553d28-5dba-492e-b004-043ea30ee635" containerName="extract-content" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242612 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="63553d28-5dba-492e-b004-043ea30ee635" containerName="extract-content" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242643 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="extract-content" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242650 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="extract-content" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242658 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242666 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242674 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerName="extract-content" Mar 20 13:31:37 crc kubenswrapper[4849]: 
I0320 13:31:37.242681 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerName="extract-content" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242691 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63553d28-5dba-492e-b004-043ea30ee635" containerName="extract-utilities" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242698 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="63553d28-5dba-492e-b004-043ea30ee635" containerName="extract-utilities" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242710 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242717 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242729 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerName="extract-content" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242737 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerName="extract-content" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242747 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerName="extract-utilities" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242754 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerName="extract-utilities" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242766 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 
13:31:37.242773 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242784 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242791 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242804 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerName="extract-utilities" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242812 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerName="extract-utilities" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.242838 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63553d28-5dba-492e-b004-043ea30ee635" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242844 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="63553d28-5dba-492e-b004-043ea30ee635" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242961 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c87e15-4f0c-422f-812b-5a4bcbf1b639" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242980 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.242994 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="63553d28-5dba-492e-b004-043ea30ee635" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: 
I0320 13:31:37.243007 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.243024 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7396166-d1a2-4565-8ccc-3ed06ce215f4" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.243036 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e8bcae-39ef-4786-b2b8-18dea74380fa" containerName="registry-server" Mar 20 13:31:37 crc kubenswrapper[4849]: E0320 13:31:37.243143 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.243155 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b606bf18-c941-4fe2-9edf-8e4bf69bdc68" containerName="marketplace-operator" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.243934 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.250763 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.256203 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvvvr"] Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.325768 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/036f1af8-83ae-4a96-b192-8349a3f78e16-utilities\") pod \"certified-operators-zvvvr\" (UID: \"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.325864 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df4jh\" (UniqueName: \"kubernetes.io/projected/036f1af8-83ae-4a96-b192-8349a3f78e16-kube-api-access-df4jh\") pod \"certified-operators-zvvvr\" (UID: \"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.325904 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/036f1af8-83ae-4a96-b192-8349a3f78e16-catalog-content\") pod \"certified-operators-zvvvr\" (UID: \"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.427387 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/036f1af8-83ae-4a96-b192-8349a3f78e16-utilities\") pod \"certified-operators-zvvvr\" (UID: 
\"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.428246 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/036f1af8-83ae-4a96-b192-8349a3f78e16-utilities\") pod \"certified-operators-zvvvr\" (UID: \"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.427468 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df4jh\" (UniqueName: \"kubernetes.io/projected/036f1af8-83ae-4a96-b192-8349a3f78e16-kube-api-access-df4jh\") pod \"certified-operators-zvvvr\" (UID: \"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.428318 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/036f1af8-83ae-4a96-b192-8349a3f78e16-catalog-content\") pod \"certified-operators-zvvvr\" (UID: \"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.428687 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/036f1af8-83ae-4a96-b192-8349a3f78e16-catalog-content\") pod \"certified-operators-zvvvr\" (UID: \"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.438088 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hkn9l"] Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.452355 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-hkn9l"] Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.452530 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.456503 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.480631 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df4jh\" (UniqueName: \"kubernetes.io/projected/036f1af8-83ae-4a96-b192-8349a3f78e16-kube-api-access-df4jh\") pod \"certified-operators-zvvvr\" (UID: \"036f1af8-83ae-4a96-b192-8349a3f78e16\") " pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.529414 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfbb658a-6c29-4a47-be7c-37aaade8f494-utilities\") pod \"community-operators-hkn9l\" (UID: \"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.529496 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88x9\" (UniqueName: \"kubernetes.io/projected/dfbb658a-6c29-4a47-be7c-37aaade8f494-kube-api-access-q88x9\") pod \"community-operators-hkn9l\" (UID: \"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.529528 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfbb658a-6c29-4a47-be7c-37aaade8f494-catalog-content\") pod \"community-operators-hkn9l\" (UID: 
\"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.569312 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.632232 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfbb658a-6c29-4a47-be7c-37aaade8f494-utilities\") pod \"community-operators-hkn9l\" (UID: \"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.632394 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q88x9\" (UniqueName: \"kubernetes.io/projected/dfbb658a-6c29-4a47-be7c-37aaade8f494-kube-api-access-q88x9\") pod \"community-operators-hkn9l\" (UID: \"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.632450 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfbb658a-6c29-4a47-be7c-37aaade8f494-catalog-content\") pod \"community-operators-hkn9l\" (UID: \"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.632932 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfbb658a-6c29-4a47-be7c-37aaade8f494-utilities\") pod \"community-operators-hkn9l\" (UID: \"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.635294 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfbb658a-6c29-4a47-be7c-37aaade8f494-catalog-content\") pod \"community-operators-hkn9l\" (UID: \"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.651697 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88x9\" (UniqueName: \"kubernetes.io/projected/dfbb658a-6c29-4a47-be7c-37aaade8f494-kube-api-access-q88x9\") pod \"community-operators-hkn9l\" (UID: \"dfbb658a-6c29-4a47-be7c-37aaade8f494\") " pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.793835 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.972023 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hkn9l"] Mar 20 13:31:37 crc kubenswrapper[4849]: W0320 13:31:37.984162 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfbb658a_6c29_4a47_be7c_37aaade8f494.slice/crio-d4d3b4341ce399c38c49e61da86f7f11668939a0ec382e722cb87fad21bb35ee WatchSource:0}: Error finding container d4d3b4341ce399c38c49e61da86f7f11668939a0ec382e722cb87fad21bb35ee: Status 404 returned error can't find the container with id d4d3b4341ce399c38c49e61da86f7f11668939a0ec382e722cb87fad21bb35ee Mar 20 13:31:37 crc kubenswrapper[4849]: I0320 13:31:37.994566 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zvvvr"] Mar 20 13:31:38 crc kubenswrapper[4849]: W0320 13:31:38.008754 4849 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod036f1af8_83ae_4a96_b192_8349a3f78e16.slice/crio-0abb2a3a8cab147f8bb5b97435b79bb58833fa0691c4c6cb464fc40634a43659 WatchSource:0}: Error finding container 0abb2a3a8cab147f8bb5b97435b79bb58833fa0691c4c6cb464fc40634a43659: Status 404 returned error can't find the container with id 0abb2a3a8cab147f8bb5b97435b79bb58833fa0691c4c6cb464fc40634a43659 Mar 20 13:31:38 crc kubenswrapper[4849]: I0320 13:31:38.825490 4849 generic.go:334] "Generic (PLEG): container finished" podID="dfbb658a-6c29-4a47-be7c-37aaade8f494" containerID="cf2fbc3d2cbdc6653d9c7b33d7cf581dc875b42f9c5697784bf9f685510469fe" exitCode=0 Mar 20 13:31:38 crc kubenswrapper[4849]: I0320 13:31:38.825584 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkn9l" event={"ID":"dfbb658a-6c29-4a47-be7c-37aaade8f494","Type":"ContainerDied","Data":"cf2fbc3d2cbdc6653d9c7b33d7cf581dc875b42f9c5697784bf9f685510469fe"} Mar 20 13:31:38 crc kubenswrapper[4849]: I0320 13:31:38.825619 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkn9l" event={"ID":"dfbb658a-6c29-4a47-be7c-37aaade8f494","Type":"ContainerStarted","Data":"d4d3b4341ce399c38c49e61da86f7f11668939a0ec382e722cb87fad21bb35ee"} Mar 20 13:31:38 crc kubenswrapper[4849]: I0320 13:31:38.827894 4849 generic.go:334] "Generic (PLEG): container finished" podID="036f1af8-83ae-4a96-b192-8349a3f78e16" containerID="e33262043eca4f7ec33a8d726c3567ca890580ce661532ad0a0876fb0f63ee4a" exitCode=0 Mar 20 13:31:38 crc kubenswrapper[4849]: I0320 13:31:38.827976 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvvvr" event={"ID":"036f1af8-83ae-4a96-b192-8349a3f78e16","Type":"ContainerDied","Data":"e33262043eca4f7ec33a8d726c3567ca890580ce661532ad0a0876fb0f63ee4a"} Mar 20 13:31:38 crc kubenswrapper[4849]: I0320 13:31:38.828003 4849 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zvvvr" event={"ID":"036f1af8-83ae-4a96-b192-8349a3f78e16","Type":"ContainerStarted","Data":"0abb2a3a8cab147f8bb5b97435b79bb58833fa0691c4c6cb464fc40634a43659"} Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.044226 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h596s"] Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.045224 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.048502 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.064998 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h596s"] Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.155041 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00f0000c-6337-492e-928d-047fcbf4dfc5-utilities\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.155150 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m6ml\" (UniqueName: \"kubernetes.io/projected/00f0000c-6337-492e-928d-047fcbf4dfc5-kube-api-access-9m6ml\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.155397 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/00f0000c-6337-492e-928d-047fcbf4dfc5-catalog-content\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.257288 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00f0000c-6337-492e-928d-047fcbf4dfc5-catalog-content\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.257800 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00f0000c-6337-492e-928d-047fcbf4dfc5-catalog-content\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.257959 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00f0000c-6337-492e-928d-047fcbf4dfc5-utilities\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.258072 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m6ml\" (UniqueName: \"kubernetes.io/projected/00f0000c-6337-492e-928d-047fcbf4dfc5-kube-api-access-9m6ml\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.258591 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/00f0000c-6337-492e-928d-047fcbf4dfc5-utilities\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.275317 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m6ml\" (UniqueName: \"kubernetes.io/projected/00f0000c-6337-492e-928d-047fcbf4dfc5-kube-api-access-9m6ml\") pod \"redhat-marketplace-h596s\" (UID: \"00f0000c-6337-492e-928d-047fcbf4dfc5\") " pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.384530 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.384603 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:31:39 crc kubenswrapper[4849]: I0320 13:31:39.422068 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:39.662572 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8gdkr" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:39.728035 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ttnt5"] Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:39.841534 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvvvr" event={"ID":"036f1af8-83ae-4a96-b192-8349a3f78e16","Type":"ContainerStarted","Data":"1e6919061b40fd6a28c1f8df4f31c0b18c66b8b46bc2c35ca5ab8de7409061d0"} Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:39.847768 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkn9l" event={"ID":"dfbb658a-6c29-4a47-be7c-37aaade8f494","Type":"ContainerStarted","Data":"9fb907f124d263a97c1f72950c1bdd67a23f992d0849898e891bc9edbf557b2d"} Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.436375 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-865zh"] Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.437774 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.447107 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.460718 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-865zh"] Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.473948 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95576495-a434-415a-98c3-714268b1d0c1-catalog-content\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.473995 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5gxh\" (UniqueName: \"kubernetes.io/projected/95576495-a434-415a-98c3-714268b1d0c1-kube-api-access-n5gxh\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.474034 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95576495-a434-415a-98c3-714268b1d0c1-utilities\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.505645 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h596s"] Mar 20 13:31:40 crc kubenswrapper[4849]: W0320 13:31:40.515353 4849 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00f0000c_6337_492e_928d_047fcbf4dfc5.slice/crio-aba3fec1102aa48cbfe89e963936cc5c478f79d30297d4c1127ebcc3338ff5e6 WatchSource:0}: Error finding container aba3fec1102aa48cbfe89e963936cc5c478f79d30297d4c1127ebcc3338ff5e6: Status 404 returned error can't find the container with id aba3fec1102aa48cbfe89e963936cc5c478f79d30297d4c1127ebcc3338ff5e6 Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.575643 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95576495-a434-415a-98c3-714268b1d0c1-utilities\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.575785 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95576495-a434-415a-98c3-714268b1d0c1-catalog-content\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.575847 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5gxh\" (UniqueName: \"kubernetes.io/projected/95576495-a434-415a-98c3-714268b1d0c1-kube-api-access-n5gxh\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.576206 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95576495-a434-415a-98c3-714268b1d0c1-utilities\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: 
I0320 13:31:40.576294 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95576495-a434-415a-98c3-714268b1d0c1-catalog-content\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.597737 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5gxh\" (UniqueName: \"kubernetes.io/projected/95576495-a434-415a-98c3-714268b1d0c1-kube-api-access-n5gxh\") pod \"redhat-operators-865zh\" (UID: \"95576495-a434-415a-98c3-714268b1d0c1\") " pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.759445 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.863324 4849 generic.go:334] "Generic (PLEG): container finished" podID="dfbb658a-6c29-4a47-be7c-37aaade8f494" containerID="9fb907f124d263a97c1f72950c1bdd67a23f992d0849898e891bc9edbf557b2d" exitCode=0 Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.863512 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkn9l" event={"ID":"dfbb658a-6c29-4a47-be7c-37aaade8f494","Type":"ContainerDied","Data":"9fb907f124d263a97c1f72950c1bdd67a23f992d0849898e891bc9edbf557b2d"} Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.872437 4849 generic.go:334] "Generic (PLEG): container finished" podID="00f0000c-6337-492e-928d-047fcbf4dfc5" containerID="979005ed05ab38ff9047e174eff7d38c59b178b6804e4620f634b345719ad408" exitCode=0 Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.872644 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h596s" 
event={"ID":"00f0000c-6337-492e-928d-047fcbf4dfc5","Type":"ContainerDied","Data":"979005ed05ab38ff9047e174eff7d38c59b178b6804e4620f634b345719ad408"} Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.872679 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h596s" event={"ID":"00f0000c-6337-492e-928d-047fcbf4dfc5","Type":"ContainerStarted","Data":"aba3fec1102aa48cbfe89e963936cc5c478f79d30297d4c1127ebcc3338ff5e6"} Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.878306 4849 generic.go:334] "Generic (PLEG): container finished" podID="036f1af8-83ae-4a96-b192-8349a3f78e16" containerID="1e6919061b40fd6a28c1f8df4f31c0b18c66b8b46bc2c35ca5ab8de7409061d0" exitCode=0 Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.878404 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvvvr" event={"ID":"036f1af8-83ae-4a96-b192-8349a3f78e16","Type":"ContainerDied","Data":"1e6919061b40fd6a28c1f8df4f31c0b18c66b8b46bc2c35ca5ab8de7409061d0"} Mar 20 13:31:40 crc kubenswrapper[4849]: I0320 13:31:40.994211 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-865zh"] Mar 20 13:31:41 crc kubenswrapper[4849]: I0320 13:31:41.887895 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zvvvr" event={"ID":"036f1af8-83ae-4a96-b192-8349a3f78e16","Type":"ContainerStarted","Data":"9fd2c5aa5545c5ee334570c0b8d0f621554508d1bebb662e81f47cec22d619c3"} Mar 20 13:31:41 crc kubenswrapper[4849]: I0320 13:31:41.891448 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hkn9l" event={"ID":"dfbb658a-6c29-4a47-be7c-37aaade8f494","Type":"ContainerStarted","Data":"1b192544a33bdd61f1415c4a562704ff4eb2ec813831e77ae5b4b6567a8b93a8"} Mar 20 13:31:41 crc kubenswrapper[4849]: I0320 13:31:41.894352 4849 generic.go:334] "Generic (PLEG): container finished" 
podID="95576495-a434-415a-98c3-714268b1d0c1" containerID="3abb0162cef43c78dfc86441958ed2952f451d2945d5b40151f80855d1d57c1d" exitCode=0 Mar 20 13:31:41 crc kubenswrapper[4849]: I0320 13:31:41.894387 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865zh" event={"ID":"95576495-a434-415a-98c3-714268b1d0c1","Type":"ContainerDied","Data":"3abb0162cef43c78dfc86441958ed2952f451d2945d5b40151f80855d1d57c1d"} Mar 20 13:31:41 crc kubenswrapper[4849]: I0320 13:31:41.894407 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865zh" event={"ID":"95576495-a434-415a-98c3-714268b1d0c1","Type":"ContainerStarted","Data":"172fec3be1bb12db5fb411137f84815a20469566c446764adb9dd7a09ae05cd1"} Mar 20 13:31:41 crc kubenswrapper[4849]: I0320 13:31:41.932671 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zvvvr" podStartSLOduration=2.396636681 podStartE2EDuration="4.932645833s" podCreationTimestamp="2026-03-20 13:31:37 +0000 UTC" firstStartedPulling="2026-03-20 13:31:38.829908893 +0000 UTC m=+448.507632288" lastFinishedPulling="2026-03-20 13:31:41.365918045 +0000 UTC m=+451.043641440" observedRunningTime="2026-03-20 13:31:41.910407322 +0000 UTC m=+451.588130717" watchObservedRunningTime="2026-03-20 13:31:41.932645833 +0000 UTC m=+451.610369228" Mar 20 13:31:41 crc kubenswrapper[4849]: I0320 13:31:41.935796 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hkn9l" podStartSLOduration=2.496141218 podStartE2EDuration="4.93578013s" podCreationTimestamp="2026-03-20 13:31:37 +0000 UTC" firstStartedPulling="2026-03-20 13:31:38.827115905 +0000 UTC m=+448.504839300" lastFinishedPulling="2026-03-20 13:31:41.266754817 +0000 UTC m=+450.944478212" observedRunningTime="2026-03-20 13:31:41.930841922 +0000 UTC m=+451.608565317" watchObservedRunningTime="2026-03-20 
13:31:41.93578013 +0000 UTC m=+451.613503525" Mar 20 13:31:42 crc kubenswrapper[4849]: I0320 13:31:42.919611 4849 generic.go:334] "Generic (PLEG): container finished" podID="00f0000c-6337-492e-928d-047fcbf4dfc5" containerID="088e3fdbb9ed39ffcc1e143a71ded49d2dfcb0664b8e44e8c87b6cd9f17b6134" exitCode=0 Mar 20 13:31:42 crc kubenswrapper[4849]: I0320 13:31:42.919698 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h596s" event={"ID":"00f0000c-6337-492e-928d-047fcbf4dfc5","Type":"ContainerDied","Data":"088e3fdbb9ed39ffcc1e143a71ded49d2dfcb0664b8e44e8c87b6cd9f17b6134"} Mar 20 13:31:43 crc kubenswrapper[4849]: I0320 13:31:43.926999 4849 generic.go:334] "Generic (PLEG): container finished" podID="95576495-a434-415a-98c3-714268b1d0c1" containerID="1768a3327c111981d8baa71692c10562fa8fa97e2717cf56b9d7e6260236a666" exitCode=0 Mar 20 13:31:43 crc kubenswrapper[4849]: I0320 13:31:43.927040 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865zh" event={"ID":"95576495-a434-415a-98c3-714268b1d0c1","Type":"ContainerDied","Data":"1768a3327c111981d8baa71692c10562fa8fa97e2717cf56b9d7e6260236a666"} Mar 20 13:31:44 crc kubenswrapper[4849]: I0320 13:31:44.935977 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865zh" event={"ID":"95576495-a434-415a-98c3-714268b1d0c1","Type":"ContainerStarted","Data":"fa310c57e06154256f4ff2a0f6070eb5919345e8b80d5f1d2bff6b233e311c88"} Mar 20 13:31:44 crc kubenswrapper[4849]: I0320 13:31:44.958056 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-865zh" podStartSLOduration=2.40550746 podStartE2EDuration="4.958036613s" podCreationTimestamp="2026-03-20 13:31:40 +0000 UTC" firstStartedPulling="2026-03-20 13:31:41.895677271 +0000 UTC m=+451.573400666" lastFinishedPulling="2026-03-20 13:31:44.448206424 +0000 UTC m=+454.125929819" 
observedRunningTime="2026-03-20 13:31:44.953582909 +0000 UTC m=+454.631306314" watchObservedRunningTime="2026-03-20 13:31:44.958036613 +0000 UTC m=+454.635760008" Mar 20 13:31:45 crc kubenswrapper[4849]: I0320 13:31:45.951178 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h596s" event={"ID":"00f0000c-6337-492e-928d-047fcbf4dfc5","Type":"ContainerStarted","Data":"8a7b23ad311a7cfe8734c2eab4d5fd48c3a1cd07df48e1c3a98a5c66c1a5945a"} Mar 20 13:31:45 crc kubenswrapper[4849]: I0320 13:31:45.981533 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h596s" podStartSLOduration=2.595828292 podStartE2EDuration="6.98151156s" podCreationTimestamp="2026-03-20 13:31:39 +0000 UTC" firstStartedPulling="2026-03-20 13:31:40.8745431 +0000 UTC m=+450.552266505" lastFinishedPulling="2026-03-20 13:31:45.260226388 +0000 UTC m=+454.937949773" observedRunningTime="2026-03-20 13:31:45.974023171 +0000 UTC m=+455.651746566" watchObservedRunningTime="2026-03-20 13:31:45.98151156 +0000 UTC m=+455.659234965" Mar 20 13:31:47 crc kubenswrapper[4849]: I0320 13:31:47.570199 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:47 crc kubenswrapper[4849]: I0320 13:31:47.570267 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:47 crc kubenswrapper[4849]: I0320 13:31:47.616281 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:47 crc kubenswrapper[4849]: I0320 13:31:47.794249 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:47 crc kubenswrapper[4849]: I0320 13:31:47.794366 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:47 crc kubenswrapper[4849]: I0320 13:31:47.840130 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:48 crc kubenswrapper[4849]: I0320 13:31:48.011110 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hkn9l" Mar 20 13:31:48 crc kubenswrapper[4849]: I0320 13:31:48.013808 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zvvvr" Mar 20 13:31:49 crc kubenswrapper[4849]: I0320 13:31:49.423191 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:49 crc kubenswrapper[4849]: I0320 13:31:49.423526 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:49 crc kubenswrapper[4849]: I0320 13:31:49.469007 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:31:50 crc kubenswrapper[4849]: I0320 13:31:50.759733 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:50 crc kubenswrapper[4849]: I0320 13:31:50.760904 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:31:51 crc kubenswrapper[4849]: I0320 13:31:51.802119 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-865zh" podUID="95576495-a434-415a-98c3-714268b1d0c1" containerName="registry-server" probeResult="failure" output=< Mar 20 13:31:51 crc kubenswrapper[4849]: timeout: failed to connect service ":50051" within 1s Mar 20 13:31:51 crc kubenswrapper[4849]: 
> Mar 20 13:31:59 crc kubenswrapper[4849]: I0320 13:31:59.492727 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h596s" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.132912 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566892-n2d8k"] Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.134091 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-n2d8k" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.138752 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.139618 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.140336 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.149104 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-n2d8k"] Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.170774 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbn57\" (UniqueName: \"kubernetes.io/projected/258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b-kube-api-access-sbn57\") pod \"auto-csr-approver-29566892-n2d8k\" (UID: \"258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b\") " pod="openshift-infra/auto-csr-approver-29566892-n2d8k" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.272737 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbn57\" (UniqueName: \"kubernetes.io/projected/258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b-kube-api-access-sbn57\") pod 
\"auto-csr-approver-29566892-n2d8k\" (UID: \"258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b\") " pod="openshift-infra/auto-csr-approver-29566892-n2d8k" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.297677 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbn57\" (UniqueName: \"kubernetes.io/projected/258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b-kube-api-access-sbn57\") pod \"auto-csr-approver-29566892-n2d8k\" (UID: \"258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b\") " pod="openshift-infra/auto-csr-approver-29566892-n2d8k" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.463805 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-n2d8k" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.705168 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-n2d8k"] Mar 20 13:32:00 crc kubenswrapper[4849]: W0320 13:32:00.713043 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod258d5ef3_c8b4_4b41_bd1f_8c742d2edd9b.slice/crio-3c871ed87070a6231466c8859c2739b9346f5dfef26923f731555731e1f070e6 WatchSource:0}: Error finding container 3c871ed87070a6231466c8859c2739b9346f5dfef26923f731555731e1f070e6: Status 404 returned error can't find the container with id 3c871ed87070a6231466c8859c2739b9346f5dfef26923f731555731e1f070e6 Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.803857 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:32:00 crc kubenswrapper[4849]: I0320 13:32:00.844767 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-865zh" Mar 20 13:32:01 crc kubenswrapper[4849]: I0320 13:32:01.058691 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566892-n2d8k" event={"ID":"258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b","Type":"ContainerStarted","Data":"3c871ed87070a6231466c8859c2739b9346f5dfef26923f731555731e1f070e6"} Mar 20 13:32:03 crc kubenswrapper[4849]: I0320 13:32:03.073322 4849 generic.go:334] "Generic (PLEG): container finished" podID="258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b" containerID="926410ea79e7d0fe1269b3af88e4d6f3cd330020079ba9251447277e91d6a84e" exitCode=0 Mar 20 13:32:03 crc kubenswrapper[4849]: I0320 13:32:03.073426 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-n2d8k" event={"ID":"258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b","Type":"ContainerDied","Data":"926410ea79e7d0fe1269b3af88e4d6f3cd330020079ba9251447277e91d6a84e"} Mar 20 13:32:04 crc kubenswrapper[4849]: I0320 13:32:04.381151 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-n2d8k" Mar 20 13:32:04 crc kubenswrapper[4849]: I0320 13:32:04.470804 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbn57\" (UniqueName: \"kubernetes.io/projected/258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b-kube-api-access-sbn57\") pod \"258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b\" (UID: \"258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b\") " Mar 20 13:32:04 crc kubenswrapper[4849]: I0320 13:32:04.477228 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b-kube-api-access-sbn57" (OuterVolumeSpecName: "kube-api-access-sbn57") pod "258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b" (UID: "258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b"). InnerVolumeSpecName "kube-api-access-sbn57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:32:04 crc kubenswrapper[4849]: I0320 13:32:04.572461 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbn57\" (UniqueName: \"kubernetes.io/projected/258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b-kube-api-access-sbn57\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:04 crc kubenswrapper[4849]: I0320 13:32:04.784352 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" podUID="db498458-18d4-4142-b536-3141889616e1" containerName="registry" containerID="cri-o://fa8ae48cdb74c441faf5f22e6576e02830f8d8cb6cb50408a054b25590f0fb24" gracePeriod=30 Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.104151 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-n2d8k" event={"ID":"258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b","Type":"ContainerDied","Data":"3c871ed87070a6231466c8859c2739b9346f5dfef26923f731555731e1f070e6"} Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.104614 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c871ed87070a6231466c8859c2739b9346f5dfef26923f731555731e1f070e6" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.104705 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-n2d8k" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.111217 4849 generic.go:334] "Generic (PLEG): container finished" podID="db498458-18d4-4142-b536-3141889616e1" containerID="fa8ae48cdb74c441faf5f22e6576e02830f8d8cb6cb50408a054b25590f0fb24" exitCode=0 Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.111255 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" event={"ID":"db498458-18d4-4142-b536-3141889616e1","Type":"ContainerDied","Data":"fa8ae48cdb74c441faf5f22e6576e02830f8d8cb6cb50408a054b25590f0fb24"} Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.195471 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.281048 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w95sg\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-kube-api-access-w95sg\") pod \"db498458-18d4-4142-b536-3141889616e1\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.281102 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-registry-tls\") pod \"db498458-18d4-4142-b536-3141889616e1\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.281152 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db498458-18d4-4142-b536-3141889616e1-ca-trust-extracted\") pod \"db498458-18d4-4142-b536-3141889616e1\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " Mar 20 13:32:05 crc kubenswrapper[4849]: 
I0320 13:32:05.281191 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-bound-sa-token\") pod \"db498458-18d4-4142-b536-3141889616e1\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.281220 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-trusted-ca\") pod \"db498458-18d4-4142-b536-3141889616e1\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.281264 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-registry-certificates\") pod \"db498458-18d4-4142-b536-3141889616e1\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.281308 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db498458-18d4-4142-b536-3141889616e1-installation-pull-secrets\") pod \"db498458-18d4-4142-b536-3141889616e1\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.281468 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"db498458-18d4-4142-b536-3141889616e1\" (UID: \"db498458-18d4-4142-b536-3141889616e1\") " Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.282767 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-trusted-ca" 
(OuterVolumeSpecName: "trusted-ca") pod "db498458-18d4-4142-b536-3141889616e1" (UID: "db498458-18d4-4142-b536-3141889616e1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.286337 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db498458-18d4-4142-b536-3141889616e1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "db498458-18d4-4142-b536-3141889616e1" (UID: "db498458-18d4-4142-b536-3141889616e1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.288966 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "db498458-18d4-4142-b536-3141889616e1" (UID: "db498458-18d4-4142-b536-3141889616e1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.289129 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-kube-api-access-w95sg" (OuterVolumeSpecName: "kube-api-access-w95sg") pod "db498458-18d4-4142-b536-3141889616e1" (UID: "db498458-18d4-4142-b536-3141889616e1"). InnerVolumeSpecName "kube-api-access-w95sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.289055 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "db498458-18d4-4142-b536-3141889616e1" (UID: "db498458-18d4-4142-b536-3141889616e1"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.291365 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "db498458-18d4-4142-b536-3141889616e1" (UID: "db498458-18d4-4142-b536-3141889616e1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.300719 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "db498458-18d4-4142-b536-3141889616e1" (UID: "db498458-18d4-4142-b536-3141889616e1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.307402 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db498458-18d4-4142-b536-3141889616e1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "db498458-18d4-4142-b536-3141889616e1" (UID: "db498458-18d4-4142-b536-3141889616e1"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.382466 4849 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db498458-18d4-4142-b536-3141889616e1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.382522 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w95sg\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-kube-api-access-w95sg\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.382538 4849 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.382558 4849 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db498458-18d4-4142-b536-3141889616e1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.382572 4849 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db498458-18d4-4142-b536-3141889616e1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.382585 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.382597 4849 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db498458-18d4-4142-b536-3141889616e1-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:05 crc 
kubenswrapper[4849]: I0320 13:32:05.455538 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-7cjjt"] Mar 20 13:32:05 crc kubenswrapper[4849]: I0320 13:32:05.463140 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-7cjjt"] Mar 20 13:32:06 crc kubenswrapper[4849]: I0320 13:32:06.117604 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" event={"ID":"db498458-18d4-4142-b536-3141889616e1","Type":"ContainerDied","Data":"0b013f46734ac74622304560496562bd5079999bd0eab925dc5aa2e9c338bbd4"} Mar 20 13:32:06 crc kubenswrapper[4849]: I0320 13:32:06.118022 4849 scope.go:117] "RemoveContainer" containerID="fa8ae48cdb74c441faf5f22e6576e02830f8d8cb6cb50408a054b25590f0fb24" Mar 20 13:32:06 crc kubenswrapper[4849]: I0320 13:32:06.118139 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:32:07 crc kubenswrapper[4849]: I0320 13:32:07.042587 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4855b8cf-a062-487c-bf23-49fd7f919e7a" path="/var/lib/kubelet/pods/4855b8cf-a062-487c-bf23-49fd7f919e7a/volumes" Mar 20 13:32:09 crc kubenswrapper[4849]: I0320 13:32:09.384240 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:32:09 crc kubenswrapper[4849]: I0320 13:32:09.384629 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4849]: I0320 13:32:09.384679 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:32:09 crc kubenswrapper[4849]: I0320 13:32:09.385529 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f14fd97e3ce2e8670a2db11e0c02e2fc9f8ae8117a289b185f11ac430a24ae2d"} pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:32:09 crc kubenswrapper[4849]: I0320 13:32:09.385628 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" containerID="cri-o://f14fd97e3ce2e8670a2db11e0c02e2fc9f8ae8117a289b185f11ac430a24ae2d" gracePeriod=600 Mar 20 13:32:10 crc kubenswrapper[4849]: I0320 13:32:10.145144 4849 generic.go:334] "Generic (PLEG): container finished" podID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerID="f14fd97e3ce2e8670a2db11e0c02e2fc9f8ae8117a289b185f11ac430a24ae2d" exitCode=0 Mar 20 13:32:10 crc kubenswrapper[4849]: I0320 13:32:10.145205 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerDied","Data":"f14fd97e3ce2e8670a2db11e0c02e2fc9f8ae8117a289b185f11ac430a24ae2d"} Mar 20 13:32:10 crc kubenswrapper[4849]: I0320 13:32:10.145519 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" 
event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"130aa48b337a88e578102daf38d6fb66cf9cae0791d30e61767b78bd10649ad0"} Mar 20 13:32:10 crc kubenswrapper[4849]: I0320 13:32:10.145546 4849 scope.go:117] "RemoveContainer" containerID="25e23d152e4e9d6eb6cdacbd0ef44ea64861ec6dc3f436c96eeb9a19e3980daa" Mar 20 13:32:36 crc kubenswrapper[4849]: I0320 13:32:36.132057 4849 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","poddb498458-18d4-4142-b536-3141889616e1"] err="unable to destroy cgroup paths for cgroup [kubepods burstable poddb498458-18d4-4142-b536-3141889616e1] : Timed out while waiting for systemd to remove kubepods-burstable-poddb498458_18d4_4142_b536_3141889616e1.slice" Mar 20 13:32:36 crc kubenswrapper[4849]: E0320 13:32:36.132597 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable poddb498458-18d4-4142-b536-3141889616e1] : unable to destroy cgroup paths for cgroup [kubepods burstable poddb498458-18d4-4142-b536-3141889616e1] : Timed out while waiting for systemd to remove kubepods-burstable-poddb498458_18d4_4142_b536_3141889616e1.slice" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" podUID="db498458-18d4-4142-b536-3141889616e1" Mar 20 13:32:36 crc kubenswrapper[4849]: I0320 13:32:36.313612 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ttnt5" Mar 20 13:32:36 crc kubenswrapper[4849]: I0320 13:32:36.383255 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ttnt5"] Mar 20 13:32:36 crc kubenswrapper[4849]: I0320 13:32:36.390491 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ttnt5"] Mar 20 13:32:37 crc kubenswrapper[4849]: I0320 13:32:37.042737 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db498458-18d4-4142-b536-3141889616e1" path="/var/lib/kubelet/pods/db498458-18d4-4142-b536-3141889616e1/volumes" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.143388 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566894-rdk6s"] Mar 20 13:34:00 crc kubenswrapper[4849]: E0320 13:34:00.144239 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b" containerName="oc" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.144251 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b" containerName="oc" Mar 20 13:34:00 crc kubenswrapper[4849]: E0320 13:34:00.144275 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db498458-18d4-4142-b536-3141889616e1" containerName="registry" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.144281 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="db498458-18d4-4142-b536-3141889616e1" containerName="registry" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.144366 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="db498458-18d4-4142-b536-3141889616e1" containerName="registry" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.144379 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b" 
containerName="oc" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.144812 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-rdk6s" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.147682 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.147906 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.152484 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-rdk6s"] Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.155264 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7lj\" (UniqueName: \"kubernetes.io/projected/6980a35c-419b-4198-8d99-788b37127584-kube-api-access-vl7lj\") pod \"auto-csr-approver-29566894-rdk6s\" (UID: \"6980a35c-419b-4198-8d99-788b37127584\") " pod="openshift-infra/auto-csr-approver-29566894-rdk6s" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.157299 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.255746 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7lj\" (UniqueName: \"kubernetes.io/projected/6980a35c-419b-4198-8d99-788b37127584-kube-api-access-vl7lj\") pod \"auto-csr-approver-29566894-rdk6s\" (UID: \"6980a35c-419b-4198-8d99-788b37127584\") " pod="openshift-infra/auto-csr-approver-29566894-rdk6s" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.275790 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7lj\" (UniqueName: 
\"kubernetes.io/projected/6980a35c-419b-4198-8d99-788b37127584-kube-api-access-vl7lj\") pod \"auto-csr-approver-29566894-rdk6s\" (UID: \"6980a35c-419b-4198-8d99-788b37127584\") " pod="openshift-infra/auto-csr-approver-29566894-rdk6s" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.462478 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-rdk6s" Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.665225 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-rdk6s"] Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.676347 4849 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:34:00 crc kubenswrapper[4849]: I0320 13:34:00.826048 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-rdk6s" event={"ID":"6980a35c-419b-4198-8d99-788b37127584","Type":"ContainerStarted","Data":"09333a4a23c385ad807851b538b8698c457f424cd70018928f9eba4eb1e17c41"} Mar 20 13:34:02 crc kubenswrapper[4849]: I0320 13:34:02.841934 4849 generic.go:334] "Generic (PLEG): container finished" podID="6980a35c-419b-4198-8d99-788b37127584" containerID="a358dfe2a6c00c79f8bc73dd703386dfb4767430cef26ca2fec949fb6093ad30" exitCode=0 Mar 20 13:34:02 crc kubenswrapper[4849]: I0320 13:34:02.842055 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-rdk6s" event={"ID":"6980a35c-419b-4198-8d99-788b37127584","Type":"ContainerDied","Data":"a358dfe2a6c00c79f8bc73dd703386dfb4767430cef26ca2fec949fb6093ad30"} Mar 20 13:34:04 crc kubenswrapper[4849]: I0320 13:34:04.084160 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-rdk6s" Mar 20 13:34:04 crc kubenswrapper[4849]: I0320 13:34:04.105242 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl7lj\" (UniqueName: \"kubernetes.io/projected/6980a35c-419b-4198-8d99-788b37127584-kube-api-access-vl7lj\") pod \"6980a35c-419b-4198-8d99-788b37127584\" (UID: \"6980a35c-419b-4198-8d99-788b37127584\") " Mar 20 13:34:04 crc kubenswrapper[4849]: I0320 13:34:04.111223 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6980a35c-419b-4198-8d99-788b37127584-kube-api-access-vl7lj" (OuterVolumeSpecName: "kube-api-access-vl7lj") pod "6980a35c-419b-4198-8d99-788b37127584" (UID: "6980a35c-419b-4198-8d99-788b37127584"). InnerVolumeSpecName "kube-api-access-vl7lj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:04 crc kubenswrapper[4849]: I0320 13:34:04.206253 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl7lj\" (UniqueName: \"kubernetes.io/projected/6980a35c-419b-4198-8d99-788b37127584-kube-api-access-vl7lj\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:04 crc kubenswrapper[4849]: I0320 13:34:04.854050 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-rdk6s" event={"ID":"6980a35c-419b-4198-8d99-788b37127584","Type":"ContainerDied","Data":"09333a4a23c385ad807851b538b8698c457f424cd70018928f9eba4eb1e17c41"} Mar 20 13:34:04 crc kubenswrapper[4849]: I0320 13:34:04.854291 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09333a4a23c385ad807851b538b8698c457f424cd70018928f9eba4eb1e17c41" Mar 20 13:34:04 crc kubenswrapper[4849]: I0320 13:34:04.854119 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-rdk6s" Mar 20 13:34:05 crc kubenswrapper[4849]: I0320 13:34:05.153081 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-9ps6c"] Mar 20 13:34:05 crc kubenswrapper[4849]: I0320 13:34:05.157131 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-9ps6c"] Mar 20 13:34:07 crc kubenswrapper[4849]: I0320 13:34:07.048507 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b89502e-d430-490b-83df-7e4ba6393a51" path="/var/lib/kubelet/pods/5b89502e-d430-490b-83df-7e4ba6393a51/volumes" Mar 20 13:34:09 crc kubenswrapper[4849]: I0320 13:34:09.384704 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:34:09 crc kubenswrapper[4849]: I0320 13:34:09.384767 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:34:32 crc kubenswrapper[4849]: I0320 13:34:32.606278 4849 scope.go:117] "RemoveContainer" containerID="d5e5dc27ea89bc2e39a9b38a4fe842de9ec1359c142d55a265b10eca6c9dc696" Mar 20 13:34:32 crc kubenswrapper[4849]: I0320 13:34:32.655585 4849 scope.go:117] "RemoveContainer" containerID="15d66e4d503a600034c2181c1e5e6d59b92ac9ebd8ca07e95f2c186ab36b47b1" Mar 20 13:34:39 crc kubenswrapper[4849]: I0320 13:34:39.385214 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:34:39 crc kubenswrapper[4849]: I0320 13:34:39.385619 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:09 crc kubenswrapper[4849]: I0320 13:35:09.385254 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:35:09 crc kubenswrapper[4849]: I0320 13:35:09.387005 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:09 crc kubenswrapper[4849]: I0320 13:35:09.387080 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:35:09 crc kubenswrapper[4849]: I0320 13:35:09.387657 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"130aa48b337a88e578102daf38d6fb66cf9cae0791d30e61767b78bd10649ad0"} pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:35:09 crc kubenswrapper[4849]: I0320 13:35:09.387722 4849 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" containerID="cri-o://130aa48b337a88e578102daf38d6fb66cf9cae0791d30e61767b78bd10649ad0" gracePeriod=600 Mar 20 13:35:09 crc kubenswrapper[4849]: I0320 13:35:09.529396 4849 generic.go:334] "Generic (PLEG): container finished" podID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerID="130aa48b337a88e578102daf38d6fb66cf9cae0791d30e61767b78bd10649ad0" exitCode=0 Mar 20 13:35:09 crc kubenswrapper[4849]: I0320 13:35:09.529496 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerDied","Data":"130aa48b337a88e578102daf38d6fb66cf9cae0791d30e61767b78bd10649ad0"} Mar 20 13:35:09 crc kubenswrapper[4849]: I0320 13:35:09.529592 4849 scope.go:117] "RemoveContainer" containerID="f14fd97e3ce2e8670a2db11e0c02e2fc9f8ae8117a289b185f11ac430a24ae2d" Mar 20 13:35:10 crc kubenswrapper[4849]: I0320 13:35:10.542765 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"b38365e3077d108f503fa5f04333e6db13420f95bbb3b1017d115a7dd6908444"} Mar 20 13:35:32 crc kubenswrapper[4849]: I0320 13:35:32.722225 4849 scope.go:117] "RemoveContainer" containerID="56a5e4f2eecb8bc33b9643e125028ea6b5b1874a4c9f763af511e38e45ed0278" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.144617 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566896-vljgv"] Mar 20 13:36:00 crc kubenswrapper[4849]: E0320 13:36:00.145420 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6980a35c-419b-4198-8d99-788b37127584" containerName="oc" Mar 20 13:36:00 crc 
kubenswrapper[4849]: I0320 13:36:00.145437 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6980a35c-419b-4198-8d99-788b37127584" containerName="oc" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.145564 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="6980a35c-419b-4198-8d99-788b37127584" containerName="oc" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.146100 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-vljgv" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.149452 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.149452 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.149544 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.158227 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-vljgv"] Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.273507 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgfrx\" (UniqueName: \"kubernetes.io/projected/b4518bf0-8998-47ae-bf3b-8920118e0aed-kube-api-access-bgfrx\") pod \"auto-csr-approver-29566896-vljgv\" (UID: \"b4518bf0-8998-47ae-bf3b-8920118e0aed\") " pod="openshift-infra/auto-csr-approver-29566896-vljgv" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.374865 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgfrx\" (UniqueName: \"kubernetes.io/projected/b4518bf0-8998-47ae-bf3b-8920118e0aed-kube-api-access-bgfrx\") pod \"auto-csr-approver-29566896-vljgv\" 
(UID: \"b4518bf0-8998-47ae-bf3b-8920118e0aed\") " pod="openshift-infra/auto-csr-approver-29566896-vljgv" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.399571 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgfrx\" (UniqueName: \"kubernetes.io/projected/b4518bf0-8998-47ae-bf3b-8920118e0aed-kube-api-access-bgfrx\") pod \"auto-csr-approver-29566896-vljgv\" (UID: \"b4518bf0-8998-47ae-bf3b-8920118e0aed\") " pod="openshift-infra/auto-csr-approver-29566896-vljgv" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.468331 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-vljgv" Mar 20 13:36:00 crc kubenswrapper[4849]: I0320 13:36:00.923194 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-vljgv"] Mar 20 13:36:01 crc kubenswrapper[4849]: I0320 13:36:01.888526 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-vljgv" event={"ID":"b4518bf0-8998-47ae-bf3b-8920118e0aed","Type":"ContainerStarted","Data":"9644065faec9b3f6051e813a2bc69195d0a3833d615a982524785579ecf777fb"} Mar 20 13:36:02 crc kubenswrapper[4849]: I0320 13:36:02.901623 4849 generic.go:334] "Generic (PLEG): container finished" podID="b4518bf0-8998-47ae-bf3b-8920118e0aed" containerID="a187921fc9e6295fb44772d7155a17e1dd78c0c08b5229c24f8089f552f92938" exitCode=0 Mar 20 13:36:02 crc kubenswrapper[4849]: I0320 13:36:02.901700 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-vljgv" event={"ID":"b4518bf0-8998-47ae-bf3b-8920118e0aed","Type":"ContainerDied","Data":"a187921fc9e6295fb44772d7155a17e1dd78c0c08b5229c24f8089f552f92938"} Mar 20 13:36:04 crc kubenswrapper[4849]: I0320 13:36:04.180501 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-vljgv" Mar 20 13:36:04 crc kubenswrapper[4849]: I0320 13:36:04.227694 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgfrx\" (UniqueName: \"kubernetes.io/projected/b4518bf0-8998-47ae-bf3b-8920118e0aed-kube-api-access-bgfrx\") pod \"b4518bf0-8998-47ae-bf3b-8920118e0aed\" (UID: \"b4518bf0-8998-47ae-bf3b-8920118e0aed\") " Mar 20 13:36:04 crc kubenswrapper[4849]: I0320 13:36:04.234992 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4518bf0-8998-47ae-bf3b-8920118e0aed-kube-api-access-bgfrx" (OuterVolumeSpecName: "kube-api-access-bgfrx") pod "b4518bf0-8998-47ae-bf3b-8920118e0aed" (UID: "b4518bf0-8998-47ae-bf3b-8920118e0aed"). InnerVolumeSpecName "kube-api-access-bgfrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:04 crc kubenswrapper[4849]: I0320 13:36:04.329429 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgfrx\" (UniqueName: \"kubernetes.io/projected/b4518bf0-8998-47ae-bf3b-8920118e0aed-kube-api-access-bgfrx\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:04 crc kubenswrapper[4849]: I0320 13:36:04.920398 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-vljgv" event={"ID":"b4518bf0-8998-47ae-bf3b-8920118e0aed","Type":"ContainerDied","Data":"9644065faec9b3f6051e813a2bc69195d0a3833d615a982524785579ecf777fb"} Mar 20 13:36:04 crc kubenswrapper[4849]: I0320 13:36:04.920460 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9644065faec9b3f6051e813a2bc69195d0a3833d615a982524785579ecf777fb" Mar 20 13:36:04 crc kubenswrapper[4849]: I0320 13:36:04.920502 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-vljgv" Mar 20 13:36:05 crc kubenswrapper[4849]: I0320 13:36:05.270251 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-zrc5b"] Mar 20 13:36:05 crc kubenswrapper[4849]: I0320 13:36:05.279037 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-zrc5b"] Mar 20 13:36:07 crc kubenswrapper[4849]: I0320 13:36:07.050452 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ad70d6-39b4-4488-99a2-34b33c249a5a" path="/var/lib/kubelet/pods/d5ad70d6-39b4-4488-99a2-34b33c249a5a/volumes" Mar 20 13:36:32 crc kubenswrapper[4849]: I0320 13:36:32.777101 4849 scope.go:117] "RemoveContainer" containerID="6d77d5825b322fe27bfce4d8d87e56d10e65d9f2d6805122b94bc71d4bd8b64a" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.001981 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb"] Mar 20 13:37:06 crc kubenswrapper[4849]: E0320 13:37:06.002708 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4518bf0-8998-47ae-bf3b-8920118e0aed" containerName="oc" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.002720 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4518bf0-8998-47ae-bf3b-8920118e0aed" containerName="oc" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.002810 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4518bf0-8998-47ae-bf3b-8920118e0aed" containerName="oc" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.003178 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.005040 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-z5lkb"] Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.006093 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-z5lkb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.007224 4849 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jzj6h" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.007268 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.007473 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.008097 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb"] Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.013554 4849 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zgfhk" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.047937 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-pgjlm"] Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.048892 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.050512 4849 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-9pxnq" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.053657 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-z5lkb"] Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.068685 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-pgjlm"] Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.076052 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbcm4\" (UniqueName: \"kubernetes.io/projected/f65771ac-fff2-4237-b3de-bff20fdda5d1-kube-api-access-wbcm4\") pod \"cert-manager-webhook-687f57d79b-pgjlm\" (UID: \"f65771ac-fff2-4237-b3de-bff20fdda5d1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.076110 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxrqx\" (UniqueName: \"kubernetes.io/projected/d73cc10b-a789-4f78-8f7e-23e5fef49ae5-kube-api-access-pxrqx\") pod \"cert-manager-858654f9db-z5lkb\" (UID: \"d73cc10b-a789-4f78-8f7e-23e5fef49ae5\") " pod="cert-manager/cert-manager-858654f9db-z5lkb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.076172 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tgh4\" (UniqueName: \"kubernetes.io/projected/602a29bd-b8ec-4538-bdfe-ae7a2bd7149c-kube-api-access-5tgh4\") pod \"cert-manager-cainjector-cf98fcc89-x9nbb\" (UID: \"602a29bd-b8ec-4538-bdfe-ae7a2bd7149c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.177150 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tgh4\" (UniqueName: \"kubernetes.io/projected/602a29bd-b8ec-4538-bdfe-ae7a2bd7149c-kube-api-access-5tgh4\") pod \"cert-manager-cainjector-cf98fcc89-x9nbb\" (UID: \"602a29bd-b8ec-4538-bdfe-ae7a2bd7149c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.177199 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbcm4\" (UniqueName: \"kubernetes.io/projected/f65771ac-fff2-4237-b3de-bff20fdda5d1-kube-api-access-wbcm4\") pod \"cert-manager-webhook-687f57d79b-pgjlm\" (UID: \"f65771ac-fff2-4237-b3de-bff20fdda5d1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.177234 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrqx\" (UniqueName: \"kubernetes.io/projected/d73cc10b-a789-4f78-8f7e-23e5fef49ae5-kube-api-access-pxrqx\") pod \"cert-manager-858654f9db-z5lkb\" (UID: \"d73cc10b-a789-4f78-8f7e-23e5fef49ae5\") " pod="cert-manager/cert-manager-858654f9db-z5lkb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.199818 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbcm4\" (UniqueName: \"kubernetes.io/projected/f65771ac-fff2-4237-b3de-bff20fdda5d1-kube-api-access-wbcm4\") pod \"cert-manager-webhook-687f57d79b-pgjlm\" (UID: \"f65771ac-fff2-4237-b3de-bff20fdda5d1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.205423 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tgh4\" (UniqueName: \"kubernetes.io/projected/602a29bd-b8ec-4538-bdfe-ae7a2bd7149c-kube-api-access-5tgh4\") pod \"cert-manager-cainjector-cf98fcc89-x9nbb\" (UID: \"602a29bd-b8ec-4538-bdfe-ae7a2bd7149c\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.206079 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrqx\" (UniqueName: \"kubernetes.io/projected/d73cc10b-a789-4f78-8f7e-23e5fef49ae5-kube-api-access-pxrqx\") pod \"cert-manager-858654f9db-z5lkb\" (UID: \"d73cc10b-a789-4f78-8f7e-23e5fef49ae5\") " pod="cert-manager/cert-manager-858654f9db-z5lkb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.327687 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.349148 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-z5lkb" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.371168 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.533972 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb"] Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.579675 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-z5lkb"] Mar 20 13:37:06 crc kubenswrapper[4849]: W0320 13:37:06.585101 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd73cc10b_a789_4f78_8f7e_23e5fef49ae5.slice/crio-6f27f22c1b417c31a0198374ec9eb89314c49a9b1b48ad5eb42bdc4496439801 WatchSource:0}: Error finding container 6f27f22c1b417c31a0198374ec9eb89314c49a9b1b48ad5eb42bdc4496439801: Status 404 returned error can't find the container with id 6f27f22c1b417c31a0198374ec9eb89314c49a9b1b48ad5eb42bdc4496439801 Mar 20 13:37:06 crc kubenswrapper[4849]: I0320 13:37:06.639072 4849 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-pgjlm"] Mar 20 13:37:06 crc kubenswrapper[4849]: W0320 13:37:06.641844 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf65771ac_fff2_4237_b3de_bff20fdda5d1.slice/crio-075f5d0b95ea551fee6634186b802bd9eeadcb4dec0c65dcfa7fef0ef430ca91 WatchSource:0}: Error finding container 075f5d0b95ea551fee6634186b802bd9eeadcb4dec0c65dcfa7fef0ef430ca91: Status 404 returned error can't find the container with id 075f5d0b95ea551fee6634186b802bd9eeadcb4dec0c65dcfa7fef0ef430ca91 Mar 20 13:37:07 crc kubenswrapper[4849]: I0320 13:37:07.331489 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-z5lkb" event={"ID":"d73cc10b-a789-4f78-8f7e-23e5fef49ae5","Type":"ContainerStarted","Data":"6f27f22c1b417c31a0198374ec9eb89314c49a9b1b48ad5eb42bdc4496439801"} Mar 20 13:37:07 crc kubenswrapper[4849]: I0320 13:37:07.334333 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" event={"ID":"f65771ac-fff2-4237-b3de-bff20fdda5d1","Type":"ContainerStarted","Data":"075f5d0b95ea551fee6634186b802bd9eeadcb4dec0c65dcfa7fef0ef430ca91"} Mar 20 13:37:07 crc kubenswrapper[4849]: I0320 13:37:07.336354 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb" event={"ID":"602a29bd-b8ec-4538-bdfe-ae7a2bd7149c","Type":"ContainerStarted","Data":"5b9f852fb49a9029b3ffbc9e5e2af417fa7e4edd66981502c9a55232e207ce49"} Mar 20 13:37:09 crc kubenswrapper[4849]: I0320 13:37:09.384532 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:37:09 crc 
kubenswrapper[4849]: I0320 13:37:09.385047 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:37:10 crc kubenswrapper[4849]: I0320 13:37:10.354188 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-z5lkb" event={"ID":"d73cc10b-a789-4f78-8f7e-23e5fef49ae5","Type":"ContainerStarted","Data":"6aa84608c42144cb4d84e79ecb68c1b06d8f7e49d64547c5228b69164434f106"} Mar 20 13:37:10 crc kubenswrapper[4849]: I0320 13:37:10.356422 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" event={"ID":"f65771ac-fff2-4237-b3de-bff20fdda5d1","Type":"ContainerStarted","Data":"097de051f61dfd487dc202b5b51bd234c769a9567d5e37cf8d011314fcf7f5be"} Mar 20 13:37:10 crc kubenswrapper[4849]: I0320 13:37:10.356742 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" Mar 20 13:37:10 crc kubenswrapper[4849]: I0320 13:37:10.359645 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb" event={"ID":"602a29bd-b8ec-4538-bdfe-ae7a2bd7149c","Type":"ContainerStarted","Data":"02d9fb2774e78e95a8ea5ea7fc8f60b55b6106f69e1304319e97334ed8867f17"} Mar 20 13:37:10 crc kubenswrapper[4849]: I0320 13:37:10.371165 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-z5lkb" podStartSLOduration=2.523984401 podStartE2EDuration="5.371136422s" podCreationTimestamp="2026-03-20 13:37:05 +0000 UTC" firstStartedPulling="2026-03-20 13:37:06.587870695 +0000 UTC m=+776.265594080" lastFinishedPulling="2026-03-20 13:37:09.435022706 +0000 UTC m=+779.112746101" 
observedRunningTime="2026-03-20 13:37:10.365774005 +0000 UTC m=+780.043497420" watchObservedRunningTime="2026-03-20 13:37:10.371136422 +0000 UTC m=+780.048859817" Mar 20 13:37:10 crc kubenswrapper[4849]: I0320 13:37:10.384306 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" podStartSLOduration=0.974736345 podStartE2EDuration="4.384273153s" podCreationTimestamp="2026-03-20 13:37:06 +0000 UTC" firstStartedPulling="2026-03-20 13:37:06.643755636 +0000 UTC m=+776.321479051" lastFinishedPulling="2026-03-20 13:37:10.053292464 +0000 UTC m=+779.731015859" observedRunningTime="2026-03-20 13:37:10.383204573 +0000 UTC m=+780.060928008" watchObservedRunningTime="2026-03-20 13:37:10.384273153 +0000 UTC m=+780.061996558" Mar 20 13:37:10 crc kubenswrapper[4849]: I0320 13:37:10.403954 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-x9nbb" podStartSLOduration=1.959485029 podStartE2EDuration="5.403933413s" podCreationTimestamp="2026-03-20 13:37:05 +0000 UTC" firstStartedPulling="2026-03-20 13:37:06.555178436 +0000 UTC m=+776.232901831" lastFinishedPulling="2026-03-20 13:37:09.99962682 +0000 UTC m=+779.677350215" observedRunningTime="2026-03-20 13:37:10.402199205 +0000 UTC m=+780.079922670" watchObservedRunningTime="2026-03-20 13:37:10.403933413 +0000 UTC m=+780.081656808" Mar 20 13:37:15 crc kubenswrapper[4849]: I0320 13:37:15.715446 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7z7ql"] Mar 20 13:37:15 crc kubenswrapper[4849]: I0320 13:37:15.716887 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovn-controller" containerID="cri-o://13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6" gracePeriod=30 Mar 20 13:37:15 crc 
kubenswrapper[4849]: I0320 13:37:15.717525 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="sbdb" containerID="cri-o://96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a" gracePeriod=30 Mar 20 13:37:15 crc kubenswrapper[4849]: I0320 13:37:15.717585 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="nbdb" containerID="cri-o://25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef" gracePeriod=30 Mar 20 13:37:15 crc kubenswrapper[4849]: I0320 13:37:15.717636 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="northd" containerID="cri-o://4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b" gracePeriod=30 Mar 20 13:37:15 crc kubenswrapper[4849]: I0320 13:37:15.717685 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee" gracePeriod=30 Mar 20 13:37:15 crc kubenswrapper[4849]: I0320 13:37:15.717730 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kube-rbac-proxy-node" containerID="cri-o://a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19" gracePeriod=30 Mar 20 13:37:15 crc kubenswrapper[4849]: I0320 13:37:15.717774 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" 
podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovn-acl-logging" containerID="cri-o://d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076" gracePeriod=30 Mar 20 13:37:15 crc kubenswrapper[4849]: I0320 13:37:15.773305 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" containerID="cri-o://5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f" gracePeriod=30 Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.061059 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/3.log" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.063024 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovn-acl-logging/0.log" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.063590 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovn-controller/0.log" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.064167 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117129 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-ovn-kubernetes\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117168 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-bin\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117190 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117208 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-netd\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117231 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-config\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117270 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-ovn\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117293 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-etc-openvswitch\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117318 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-systemd\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117338 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-node-log\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117376 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-openvswitch\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117389 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-log-socket\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: 
\"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117409 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-env-overrides\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117427 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ba9a25c-6156-4c78-a394-60507829eced-ovn-node-metrics-cert\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117440 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-systemd-units\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117455 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-script-lib\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117472 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bh57\" (UniqueName: \"kubernetes.io/projected/0ba9a25c-6156-4c78-a394-60507829eced-kube-api-access-7bh57\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117490 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-slash\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117557 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-netns\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117573 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-var-lib-openvswitch\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.117606 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-kubelet\") pod \"0ba9a25c-6156-4c78-a394-60507829eced\" (UID: \"0ba9a25c-6156-4c78-a394-60507829eced\") " Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.118565 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.118612 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.118644 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.118670 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.118695 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.118707 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.118805 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119022 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119066 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119094 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119284 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119531 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119578 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119601 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-node-log" (OuterVolumeSpecName: "node-log") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119621 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-log-socket" (OuterVolumeSpecName: "log-socket") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119625 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-slash" (OuterVolumeSpecName: "host-slash") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.119642 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122488 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hm6d5"] Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122750 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122769 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122782 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122790 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122799 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovn-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122808 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovn-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122836 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kubecfg-setup" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122844 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kubecfg-setup" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122857 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" 
containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122865 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122874 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122882 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122890 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="northd" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122898 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="northd" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122909 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kube-rbac-proxy-node" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122916 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kube-rbac-proxy-node" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122926 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122934 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122945 4849 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovn-acl-logging" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122953 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovn-acl-logging" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122961 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="sbdb" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122968 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="sbdb" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.122979 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="nbdb" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.122987 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="nbdb" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123103 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="nbdb" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123117 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123124 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="sbdb" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123134 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123142 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" 
containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123151 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovn-acl-logging" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123163 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="kube-rbac-proxy-node" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123171 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovn-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123181 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="northd" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123191 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.123320 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123350 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123391 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba9a25c-6156-4c78-a394-60507829eced-kube-api-access-7bh57" (OuterVolumeSpecName: "kube-api-access-7bh57") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "kube-api-access-7bh57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123468 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123628 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba9a25c-6156-4c78-a394-60507829eced-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.123651 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba9a25c-6156-4c78-a394-60507829eced" containerName="ovnkube-controller" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.125063 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.133337 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0ba9a25c-6156-4c78-a394-60507829eced" (UID: "0ba9a25c-6156-4c78-a394-60507829eced"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218334 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-ovnkube-script-lib\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218455 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-systemd-units\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218544 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-kubelet\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218608 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-log-socket\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218656 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218706 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-cni-netd\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218762 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-var-lib-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218899 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-etc-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.218962 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-systemd\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219042 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d60ceeff-edd9-490a-86af-a630d6a8c702-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219097 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219164 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kwmg\" (UniqueName: \"kubernetes.io/projected/d60ceeff-edd9-490a-86af-a630d6a8c702-kube-api-access-6kwmg\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219236 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219364 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-node-log\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219420 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-ovnkube-config\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219488 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-run-netns\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219587 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-ovn\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219632 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-env-overrides\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.219681 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-slash\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220136 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-cni-bin\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220284 4849 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220328 4849 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220346 4849 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220361 4849 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220370 4849 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220383 4849 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220393 4849 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220401 4849 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220413 4849 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0ba9a25c-6156-4c78-a394-60507829eced-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220423 4849 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ba9a25c-6156-4c78-a394-60507829eced-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220432 4849 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220441 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bh57\" (UniqueName: \"kubernetes.io/projected/0ba9a25c-6156-4c78-a394-60507829eced-kube-api-access-7bh57\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220452 4849 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220459 4849 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-slash\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220469 4849 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220480 4849 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220490 4849 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220499 4849 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220510 4849 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.220520 4849 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0ba9a25c-6156-4c78-a394-60507829eced-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321026 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-log-socket\") pod \"ovnkube-node-hm6d5\" (UID: 
\"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321063 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321083 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-kubelet\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321103 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-cni-netd\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321122 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-var-lib-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321145 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-etc-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: 
\"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321171 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-systemd\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321198 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d60ceeff-edd9-490a-86af-a630d6a8c702-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321213 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321206 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-kubelet\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321214 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-log-socket\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 
13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321301 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-systemd\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321299 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321230 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kwmg\" (UniqueName: \"kubernetes.io/projected/d60ceeff-edd9-490a-86af-a630d6a8c702-kube-api-access-6kwmg\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321309 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-cni-netd\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321282 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 
13:37:16.321361 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-etc-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321424 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321423 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-var-lib-openvswitch\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321444 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-node-log\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321462 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-ovnkube-config\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321489 4849 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-run-netns\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321499 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-node-log\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321510 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-ovn\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321528 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-env-overrides\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321536 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-run-netns\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321544 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-slash\") 
pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321562 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-cni-bin\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321583 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-ovnkube-script-lib\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321548 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321627 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-systemd-units\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321605 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-cni-bin\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321607 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-systemd-units\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321720 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-host-slash\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.321611 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d60ceeff-edd9-490a-86af-a630d6a8c702-run-ovn\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.322413 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-env-overrides\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.322531 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-ovnkube-script-lib\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.323983 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d60ceeff-edd9-490a-86af-a630d6a8c702-ovnkube-config\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.324730 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d60ceeff-edd9-490a-86af-a630d6a8c702-ovn-node-metrics-cert\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.337060 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kwmg\" (UniqueName: \"kubernetes.io/projected/d60ceeff-edd9-490a-86af-a630d6a8c702-kube-api-access-6kwmg\") pod \"ovnkube-node-hm6d5\" (UID: \"d60ceeff-edd9-490a-86af-a630d6a8c702\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.374120 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-pgjlm" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.398932 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/2.log" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.399434 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/1.log" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.399478 4849 generic.go:334] "Generic (PLEG): container finished" podID="606dc5eb-f89f-41cb-8aa2-f55fcab8f04d" containerID="69558596bddd811517bc5bd607ff8fa66fc36eff63c8acc05cc3b9bc094b4472" exitCode=2 Mar 20 13:37:16 crc 
kubenswrapper[4849]: I0320 13:37:16.399537 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7nxh7" event={"ID":"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d","Type":"ContainerDied","Data":"69558596bddd811517bc5bd607ff8fa66fc36eff63c8acc05cc3b9bc094b4472"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.399572 4849 scope.go:117] "RemoveContainer" containerID="d1f554eb38b10f82a2dc6d0a57d9a997842f5e2c52c8026ecfd16cebb6606195" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.400132 4849 scope.go:117] "RemoveContainer" containerID="69558596bddd811517bc5bd607ff8fa66fc36eff63c8acc05cc3b9bc094b4472" Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.400352 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7nxh7_openshift-multus(606dc5eb-f89f-41cb-8aa2-f55fcab8f04d)\"" pod="openshift-multus/multus-7nxh7" podUID="606dc5eb-f89f-41cb-8aa2-f55fcab8f04d" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.404127 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovnkube-controller/3.log" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.407343 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovn-acl-logging/0.log" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408059 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7z7ql_0ba9a25c-6156-4c78-a394-60507829eced/ovn-controller/0.log" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408572 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f" exitCode=0 
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408607 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a" exitCode=0 Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408617 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef" exitCode=0 Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408626 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b" exitCode=0 Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408636 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee" exitCode=0 Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408644 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19" exitCode=0 Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408652 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076" exitCode=143 Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408660 4849 generic.go:334] "Generic (PLEG): container finished" podID="0ba9a25c-6156-4c78-a394-60507829eced" containerID="13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6" exitCode=143 Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408687 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.408685 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409014 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409033 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409044 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409055 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409067 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} Mar 20 13:37:16 crc 
kubenswrapper[4849]: I0320 13:37:16.409078 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409089 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409095 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409100 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409106 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409111 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409116 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409121 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} Mar 20 13:37:16 crc 
kubenswrapper[4849]: I0320 13:37:16.409126 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409131 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409138 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409145 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409152 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409157 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409162 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409167 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409172 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409177 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409182 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409188 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409193 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409200 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409209 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409214 4849 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409219 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409225 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409232 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409237 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409242 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409247 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409252 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409257 4849 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409264 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7z7ql" event={"ID":"0ba9a25c-6156-4c78-a394-60507829eced","Type":"ContainerDied","Data":"fa2cf4e0ac5699c8e56b34b95e16adc893344ac006dd28aa0c1c51d2ec475922"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409291 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409299 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409305 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409311 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409318 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409322 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409328 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409333 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409339 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.409344 4849 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"}
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.432873 4849 scope.go:117] "RemoveContainer" containerID="5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.439936 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.450154 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7z7ql"]
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.453750 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7z7ql"]
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.461553 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.476457 4849 scope.go:117] "RemoveContainer" containerID="96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.494576 4849 scope.go:117] "RemoveContainer" containerID="25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.509702 4849 scope.go:117] "RemoveContainer" containerID="4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.529468 4849 scope.go:117] "RemoveContainer" containerID="55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.542244 4849 scope.go:117] "RemoveContainer" containerID="a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.555190 4849 scope.go:117] "RemoveContainer" containerID="d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.566862 4849 scope.go:117] "RemoveContainer" containerID="13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.644889 4849 scope.go:117] "RemoveContainer" containerID="f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.657039 4849 scope.go:117] "RemoveContainer" containerID="5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.657525 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": container with ID starting with 5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f not found: ID does not exist" containerID="5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.657554 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} err="failed to get container status \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": rpc error: code = NotFound desc = could not find container \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": container with ID starting with 5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.657579 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.658117 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": container with ID starting with d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f not found: ID does not exist" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.658139 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} err="failed to get container status \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": rpc error: code = NotFound desc = could not find container \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": container with ID starting with d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.658155 4849 scope.go:117] "RemoveContainer" containerID="96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.658411 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": container with ID starting with 96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a not found: ID does not exist" containerID="96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.658434 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} err="failed to get container status \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": rpc error: code = NotFound desc = could not find container \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": container with ID starting with 96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.658450 4849 scope.go:117] "RemoveContainer" containerID="25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.658647 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": container with ID starting with 25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef not found: ID does not exist" containerID="25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.658667 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} err="failed to get container status \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": rpc error: code = NotFound desc = could not find container \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": container with ID starting with 25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.658683 4849 scope.go:117] "RemoveContainer" containerID="4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.659001 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": container with ID starting with 4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b not found: ID does not exist" containerID="4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.659023 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} err="failed to get container status \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": rpc error: code = NotFound desc = could not find container \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": container with ID starting with 4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.659038 4849 scope.go:117] "RemoveContainer" containerID="55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.659248 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": container with ID starting with 55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee not found: ID does not exist" containerID="55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.659272 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} err="failed to get container status \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": rpc error: code = NotFound desc = could not find container \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": container with ID starting with 55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.659292 4849 scope.go:117] "RemoveContainer" containerID="a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.659537 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": container with ID starting with a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19 not found: ID does not exist" containerID="a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.659559 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} err="failed to get container status \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": rpc error: code = NotFound desc = could not find container \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": container with ID starting with a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.659575 4849 scope.go:117] "RemoveContainer" containerID="d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.659779 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": container with ID starting with d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076 not found: ID does not exist" containerID="d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.659803 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} err="failed to get container status \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": rpc error: code = NotFound desc = could not find container \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": container with ID starting with d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.659837 4849 scope.go:117] "RemoveContainer" containerID="13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.660021 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": container with ID starting with 13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6 not found: ID does not exist" containerID="13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.660041 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} err="failed to get container status \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": rpc error: code = NotFound desc = could not find container \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": container with ID starting with 13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.660057 4849 scope.go:117] "RemoveContainer" containerID="f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"
Mar 20 13:37:16 crc kubenswrapper[4849]: E0320 13:37:16.660413 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": container with ID starting with f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b not found: ID does not exist" containerID="f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.660434 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"} err="failed to get container status \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": rpc error: code = NotFound desc = could not find container \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": container with ID starting with f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.660454 4849 scope.go:117] "RemoveContainer" containerID="5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.660642 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} err="failed to get container status \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": rpc error: code = NotFound desc = could not find container \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": container with ID starting with 5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.660660 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.660859 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} err="failed to get container status \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": rpc error: code = NotFound desc = could not find container \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": container with ID starting with d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.660880 4849 scope.go:117] "RemoveContainer" containerID="96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.661104 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} err="failed to get container status \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": rpc error: code = NotFound desc = could not find container \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": container with ID starting with 96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.661125 4849 scope.go:117] "RemoveContainer" containerID="25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.661294 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} err="failed to get container status \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": rpc error: code = NotFound desc = could not find container \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": container with ID starting with 25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.661313 4849 scope.go:117] "RemoveContainer" containerID="4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.661488 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} err="failed to get container status \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": rpc error: code = NotFound desc = could not find container \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": container with ID starting with 4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.661511 4849 scope.go:117] "RemoveContainer" containerID="55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.661856 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} err="failed to get container status \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": rpc error: code = NotFound desc = could not find container \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": container with ID starting with 55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.661880 4849 scope.go:117] "RemoveContainer" containerID="a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.662073 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} err="failed to get container status \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": rpc error: code = NotFound desc = could not find container \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": container with ID starting with a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.662093 4849 scope.go:117] "RemoveContainer" containerID="d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.662286 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} err="failed to get container status \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": rpc error: code = NotFound desc = could not find container \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": container with ID starting with d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.662308 4849 scope.go:117] "RemoveContainer" containerID="13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.662479 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} err="failed to get container status \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": rpc error: code = NotFound desc = could not find container \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": container with ID starting with 13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.662596 4849 scope.go:117] "RemoveContainer" containerID="f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.662797 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"} err="failed to get container status \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": rpc error: code = NotFound desc = could not find container \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": container with ID starting with f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.662829 4849 scope.go:117] "RemoveContainer" containerID="5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.663001 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} err="failed to get container status \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": rpc error: code = NotFound desc = could not find container \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": container with ID starting with 5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.663022 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.663205 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} err="failed to get container status \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": rpc error: code = NotFound desc = could not find container \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": container with ID starting with d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.663225 4849 scope.go:117] "RemoveContainer" containerID="96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.663393 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} err="failed to get container status \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": rpc error: code = NotFound desc = could not find container \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": container with ID starting with 96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.663416 4849 scope.go:117] "RemoveContainer" containerID="25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.663804 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} err="failed to get container status \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": rpc error: code = NotFound desc = could not find container \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": container with ID starting with 25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.663841 4849 scope.go:117] "RemoveContainer" containerID="4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.664044 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} err="failed to get container status \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": rpc error: code = NotFound desc = could not find container \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": container with ID starting with 4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.664064 4849 scope.go:117] "RemoveContainer" containerID="55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.664255 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} err="failed to get container status \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": rpc error: code = NotFound desc = could not find container \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": container with ID starting with 55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.664305 4849 scope.go:117] "RemoveContainer" containerID="a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.664570 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} err="failed to get container status \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": rpc error: code = NotFound desc = could not find container \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": container with ID starting with a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.664591 4849 scope.go:117] "RemoveContainer" containerID="d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.664780 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} err="failed to get container status \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": rpc error: code = NotFound desc = could not find container \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": container with ID starting with d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.664801 4849 scope.go:117] "RemoveContainer" containerID="13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.665015 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} err="failed to get container status \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": rpc error: code = NotFound desc = could not find container \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": container with ID starting with 13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.665038 4849 scope.go:117] "RemoveContainer" containerID="f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.665244 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"} err="failed to get container status \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": rpc error: code = NotFound desc = could not find container \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": container with ID starting with f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.665265 4849 scope.go:117] "RemoveContainer" containerID="5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.665636 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f"} err="failed to get container status \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": rpc error: code = NotFound desc = could not find container \"5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f\": container with ID starting with 5e1eda85e7703a63fdcb3ce0fbd82df69281b7eccaa9ea5e45044d5fa24f8d3f not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.665659 4849 scope.go:117] "RemoveContainer" containerID="d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.665899 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f"} err="failed to get container status \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": rpc error: code = NotFound desc = could not find container \"d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f\": container with ID starting with d7759c438e26816cbfe64994d10fe608cbbf13350c6ee2eb53003d16a9f0eb8f not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.665920 4849 scope.go:117] "RemoveContainer" containerID="96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666097 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a"} err="failed to get container status \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": rpc error: code = NotFound desc = could not find container \"96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a\": container with ID starting with 96550f4518d68c1b243a6b7405291852d1938b980a5fea7a8c829440c09f233a not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666116 4849 scope.go:117] "RemoveContainer" containerID="25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666282 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef"} err="failed to get container status \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": rpc error: code = NotFound desc = could not find container \"25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef\": container with ID starting with 25f0cc7dc8a5f8723c2f473250cf00eda7cbea0d591ae6fad67e92a7108182ef not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666302 4849 scope.go:117] "RemoveContainer" containerID="4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666456 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b"} err="failed to get container status \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": rpc error: code = NotFound desc = could not find container \"4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b\": container with ID starting with 4f0a130bca76d043db958258f6379c2d228aeaba816a216809cf371133e1ec8b not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666476 4849 scope.go:117] "RemoveContainer" containerID="55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666661 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee"} err="failed to get container status \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": rpc error: code = NotFound desc = could not find container \"55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee\": container with ID starting with 55594daf3836205d985cf64201a90b71b1282ddb91ed2c1e4b4dd058e249c0ee not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666682 4849 scope.go:117] "RemoveContainer" containerID="a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666889 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19"} err="failed to get container status \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": rpc error: code = NotFound desc = could not find container \"a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19\": container with ID starting with a1298e6dfb442ed3021a757d5d5d4419a45b3fc8405cf48e0045a78026345f19 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.666909 4849 scope.go:117] "RemoveContainer" containerID="d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.667068 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076"} err="failed to get container status \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": rpc error: code = NotFound desc = could not find container \"d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076\": container with ID starting with d8f42510975f6af5f7500381b5d978cd15d44fdf962742b7d74eeb929152a076 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.667089 4849 scope.go:117] "RemoveContainer" containerID="13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.667246 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6"} err="failed to get container status \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": rpc error: code = NotFound desc = could not find container \"13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6\": container with ID starting with 13d8b0d994a682fa815066ddde85bc94317268c4122d9ccaa21454d858c4a3f6 not found: ID does not exist"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.667265 4849 scope.go:117] "RemoveContainer" containerID="f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"
Mar 20 13:37:16 crc kubenswrapper[4849]: I0320 13:37:16.667406 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b"} err="failed to get container status \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": rpc error: code = NotFound desc = could not find container \"f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b\": container with ID starting with f68787a872efefb1b6646d399934a20c734dd2b8030786a14324acf278b0a96b not found: ID does not 
exist" Mar 20 13:37:17 crc kubenswrapper[4849]: I0320 13:37:17.049226 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba9a25c-6156-4c78-a394-60507829eced" path="/var/lib/kubelet/pods/0ba9a25c-6156-4c78-a394-60507829eced/volumes" Mar 20 13:37:17 crc kubenswrapper[4849]: I0320 13:37:17.416409 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/2.log" Mar 20 13:37:17 crc kubenswrapper[4849]: I0320 13:37:17.419770 4849 generic.go:334] "Generic (PLEG): container finished" podID="d60ceeff-edd9-490a-86af-a630d6a8c702" containerID="db7579adf3588996603f84d802ac87125ee0e2601ebe213695d4941611902648" exitCode=0 Mar 20 13:37:17 crc kubenswrapper[4849]: I0320 13:37:17.419842 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerDied","Data":"db7579adf3588996603f84d802ac87125ee0e2601ebe213695d4941611902648"} Mar 20 13:37:17 crc kubenswrapper[4849]: I0320 13:37:17.419877 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"4429886ba4ed8637e63d4e3cd8ec55ccac5b381764c4f1083e7ca9bd4f48b58a"} Mar 20 13:37:18 crc kubenswrapper[4849]: I0320 13:37:18.429429 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"f49602aedd8de915f09d5984cf06327d775226378cd96131cc1f52b2edc0fa4f"} Mar 20 13:37:18 crc kubenswrapper[4849]: I0320 13:37:18.429655 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" 
event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"cb89e8dea02adacc8911f0b265ab0c7588c2c63caccd853b48fc39006cbfa0b3"} Mar 20 13:37:18 crc kubenswrapper[4849]: I0320 13:37:18.429665 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"f7352ca415a9dcccc394e1650283f257caf06ec495d2cd6e16fdce7a894889a5"} Mar 20 13:37:18 crc kubenswrapper[4849]: I0320 13:37:18.429674 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"6d5829cd60d1c466139acdc1dd9b3dd0a62231cb5103aa49aa8dd4c5d236e4bb"} Mar 20 13:37:18 crc kubenswrapper[4849]: I0320 13:37:18.429683 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"c757ea6c9304526d07c218016c248c3ff671302907fa790ee40f4dd9325a0dad"} Mar 20 13:37:18 crc kubenswrapper[4849]: I0320 13:37:18.429692 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"aa4f08a26fd73007e70648242d1e41555483f5be1534b4b0462435ad57eef81d"} Mar 20 13:37:21 crc kubenswrapper[4849]: I0320 13:37:21.459613 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"4e6f754cda7647f31e8a031f519cff487eae621ff842e0e7590847d2d9e2db04"} Mar 20 13:37:23 crc kubenswrapper[4849]: I0320 13:37:23.473436 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" 
event={"ID":"d60ceeff-edd9-490a-86af-a630d6a8c702","Type":"ContainerStarted","Data":"9917a5ff698f1399c1e918e4acb071b4f92d3c2300b44e269a15afe213b9dc09"} Mar 20 13:37:23 crc kubenswrapper[4849]: I0320 13:37:23.473961 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:23 crc kubenswrapper[4849]: I0320 13:37:23.473972 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:23 crc kubenswrapper[4849]: I0320 13:37:23.497742 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" podStartSLOduration=7.497727118 podStartE2EDuration="7.497727118s" podCreationTimestamp="2026-03-20 13:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:37:23.495578839 +0000 UTC m=+793.173302264" watchObservedRunningTime="2026-03-20 13:37:23.497727118 +0000 UTC m=+793.175450513" Mar 20 13:37:23 crc kubenswrapper[4849]: I0320 13:37:23.502339 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:24 crc kubenswrapper[4849]: I0320 13:37:24.477894 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:24 crc kubenswrapper[4849]: I0320 13:37:24.499891 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:30 crc kubenswrapper[4849]: I0320 13:37:30.035881 4849 scope.go:117] "RemoveContainer" containerID="69558596bddd811517bc5bd607ff8fa66fc36eff63c8acc05cc3b9bc094b4472" Mar 20 13:37:30 crc kubenswrapper[4849]: I0320 13:37:30.518047 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-7nxh7_606dc5eb-f89f-41cb-8aa2-f55fcab8f04d/kube-multus/2.log" Mar 20 13:37:30 crc kubenswrapper[4849]: I0320 13:37:30.518553 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7nxh7" event={"ID":"606dc5eb-f89f-41cb-8aa2-f55fcab8f04d","Type":"ContainerStarted","Data":"adb69eac15117fb41eb895de07ea61c7b1952883c31867ce251fc15b5de598fb"} Mar 20 13:37:39 crc kubenswrapper[4849]: I0320 13:37:39.384778 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:37:39 crc kubenswrapper[4849]: I0320 13:37:39.385393 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:37:46 crc kubenswrapper[4849]: I0320 13:37:46.466168 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm6d5" Mar 20 13:37:51 crc kubenswrapper[4849]: I0320 13:37:51.158837 4849 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 13:37:53 crc kubenswrapper[4849]: I0320 13:37:53.815551 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg"] Mar 20 13:37:53 crc kubenswrapper[4849]: I0320 13:37:53.816889 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:53 crc kubenswrapper[4849]: I0320 13:37:53.819196 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:37:53 crc kubenswrapper[4849]: I0320 13:37:53.832067 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg"] Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.009638 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.009808 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.009888 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27ntl\" (UniqueName: \"kubernetes.io/projected/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-kube-api-access-27ntl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: 
I0320 13:37:54.110783 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.111128 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.111199 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27ntl\" (UniqueName: \"kubernetes.io/projected/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-kube-api-access-27ntl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.111603 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.111622 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.131797 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27ntl\" (UniqueName: \"kubernetes.io/projected/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-kube-api-access-27ntl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.137370 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.387358 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg"] Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.818844 4849 generic.go:334] "Generic (PLEG): container finished" podID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerID="4fb4e18c7c5ef153de6b0d29e86dcb2f03942d3e89139e7d632e321a8fc9b097" exitCode=0 Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.819059 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" event={"ID":"b39cc5fb-85b5-407a-b4ca-7b674ae7039d","Type":"ContainerDied","Data":"4fb4e18c7c5ef153de6b0d29e86dcb2f03942d3e89139e7d632e321a8fc9b097"} Mar 20 13:37:54 crc kubenswrapper[4849]: I0320 13:37:54.819176 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" event={"ID":"b39cc5fb-85b5-407a-b4ca-7b674ae7039d","Type":"ContainerStarted","Data":"82ae20f969a02cb3c52ff6c8eb4a8d0ef972c845cf10efead391ae5e1fcb6dfd"} Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.034585 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c5xn9"] Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.036549 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.050326 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c5xn9"] Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.140247 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-catalog-content\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.140594 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts4lm\" (UniqueName: \"kubernetes.io/projected/28a0d812-045a-4785-8e93-d568705c3846-kube-api-access-ts4lm\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.140667 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-utilities\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " 
pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.241948 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-catalog-content\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.242001 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts4lm\" (UniqueName: \"kubernetes.io/projected/28a0d812-045a-4785-8e93-d568705c3846-kube-api-access-ts4lm\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.242036 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-utilities\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.242576 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-utilities\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.242701 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-catalog-content\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc 
kubenswrapper[4849]: I0320 13:37:56.260864 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts4lm\" (UniqueName: \"kubernetes.io/projected/28a0d812-045a-4785-8e93-d568705c3846-kube-api-access-ts4lm\") pod \"redhat-operators-c5xn9\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") " pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.370944 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c5xn9" Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.833462 4849 generic.go:334] "Generic (PLEG): container finished" podID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerID="58ca32cca323c23061ef518dc88ff13430cc0968603adf03745064f6093fc338" exitCode=0 Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.833515 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" event={"ID":"b39cc5fb-85b5-407a-b4ca-7b674ae7039d","Type":"ContainerDied","Data":"58ca32cca323c23061ef518dc88ff13430cc0968603adf03745064f6093fc338"} Mar 20 13:37:56 crc kubenswrapper[4849]: I0320 13:37:56.903029 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c5xn9"] Mar 20 13:37:56 crc kubenswrapper[4849]: W0320 13:37:56.918531 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a0d812_045a_4785_8e93_d568705c3846.slice/crio-827223535ff2d6318f3ce84b8ec71216b9722896d1d9b697f352bd3a693e30ca WatchSource:0}: Error finding container 827223535ff2d6318f3ce84b8ec71216b9722896d1d9b697f352bd3a693e30ca: Status 404 returned error can't find the container with id 827223535ff2d6318f3ce84b8ec71216b9722896d1d9b697f352bd3a693e30ca Mar 20 13:37:57 crc kubenswrapper[4849]: I0320 13:37:57.844026 4849 generic.go:334] "Generic (PLEG): container 
finished" podID="28a0d812-045a-4785-8e93-d568705c3846" containerID="01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116" exitCode=0 Mar 20 13:37:57 crc kubenswrapper[4849]: I0320 13:37:57.844280 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5xn9" event={"ID":"28a0d812-045a-4785-8e93-d568705c3846","Type":"ContainerDied","Data":"01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116"} Mar 20 13:37:57 crc kubenswrapper[4849]: I0320 13:37:57.844386 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5xn9" event={"ID":"28a0d812-045a-4785-8e93-d568705c3846","Type":"ContainerStarted","Data":"827223535ff2d6318f3ce84b8ec71216b9722896d1d9b697f352bd3a693e30ca"} Mar 20 13:37:57 crc kubenswrapper[4849]: I0320 13:37:57.848425 4849 generic.go:334] "Generic (PLEG): container finished" podID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerID="5d3d6499432aa0d607c393e49327c393c8895576aa0c1d252f9a5b8094446ffd" exitCode=0 Mar 20 13:37:57 crc kubenswrapper[4849]: I0320 13:37:57.848501 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" event={"ID":"b39cc5fb-85b5-407a-b4ca-7b674ae7039d","Type":"ContainerDied","Data":"5d3d6499432aa0d607c393e49327c393c8895576aa0c1d252f9a5b8094446ffd"} Mar 20 13:37:58 crc kubenswrapper[4849]: I0320 13:37:58.855463 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5xn9" event={"ID":"28a0d812-045a-4785-8e93-d568705c3846","Type":"ContainerStarted","Data":"a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597"} Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.094976 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.281179 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-bundle\") pod \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.281256 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-util\") pod \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.281290 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27ntl\" (UniqueName: \"kubernetes.io/projected/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-kube-api-access-27ntl\") pod \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\" (UID: \"b39cc5fb-85b5-407a-b4ca-7b674ae7039d\") " Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.283326 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-bundle" (OuterVolumeSpecName: "bundle") pod "b39cc5fb-85b5-407a-b4ca-7b674ae7039d" (UID: "b39cc5fb-85b5-407a-b4ca-7b674ae7039d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.288196 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-kube-api-access-27ntl" (OuterVolumeSpecName: "kube-api-access-27ntl") pod "b39cc5fb-85b5-407a-b4ca-7b674ae7039d" (UID: "b39cc5fb-85b5-407a-b4ca-7b674ae7039d"). InnerVolumeSpecName "kube-api-access-27ntl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.315990 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-util" (OuterVolumeSpecName: "util") pod "b39cc5fb-85b5-407a-b4ca-7b674ae7039d" (UID: "b39cc5fb-85b5-407a-b4ca-7b674ae7039d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.382733 4849 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.382764 4849 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.382773 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27ntl\" (UniqueName: \"kubernetes.io/projected/b39cc5fb-85b5-407a-b4ca-7b674ae7039d-kube-api-access-27ntl\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.874990 4849 generic.go:334] "Generic (PLEG): container finished" podID="28a0d812-045a-4785-8e93-d568705c3846" containerID="a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597" exitCode=0 Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.875086 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5xn9" event={"ID":"28a0d812-045a-4785-8e93-d568705c3846","Type":"ContainerDied","Data":"a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597"} Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.880283 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg" event={"ID":"b39cc5fb-85b5-407a-b4ca-7b674ae7039d","Type":"ContainerDied","Data":"82ae20f969a02cb3c52ff6c8eb4a8d0ef972c845cf10efead391ae5e1fcb6dfd"}
Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.880341 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ae20f969a02cb3c52ff6c8eb4a8d0ef972c845cf10efead391ae5e1fcb6dfd"
Mar 20 13:37:59 crc kubenswrapper[4849]: I0320 13:37:59.880407 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.148279 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566898-2zntj"]
Mar 20 13:38:00 crc kubenswrapper[4849]: E0320 13:38:00.148471 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerName="extract"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.148481 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerName="extract"
Mar 20 13:38:00 crc kubenswrapper[4849]: E0320 13:38:00.148491 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerName="pull"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.148498 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerName="pull"
Mar 20 13:38:00 crc kubenswrapper[4849]: E0320 13:38:00.148512 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerName="util"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.148518 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerName="util"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.148603 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39cc5fb-85b5-407a-b4ca-7b674ae7039d" containerName="extract"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.148958 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-2zntj"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.151535 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.151610 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.152278 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.160362 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-2zntj"]
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.292451 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xsll\" (UniqueName: \"kubernetes.io/projected/72c655a1-070d-4965-9949-6b3080d99104-kube-api-access-4xsll\") pod \"auto-csr-approver-29566898-2zntj\" (UID: \"72c655a1-070d-4965-9949-6b3080d99104\") " pod="openshift-infra/auto-csr-approver-29566898-2zntj"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.393177 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xsll\" (UniqueName: \"kubernetes.io/projected/72c655a1-070d-4965-9949-6b3080d99104-kube-api-access-4xsll\") pod \"auto-csr-approver-29566898-2zntj\" (UID: \"72c655a1-070d-4965-9949-6b3080d99104\") " pod="openshift-infra/auto-csr-approver-29566898-2zntj"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.415732 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xsll\" (UniqueName: \"kubernetes.io/projected/72c655a1-070d-4965-9949-6b3080d99104-kube-api-access-4xsll\") pod \"auto-csr-approver-29566898-2zntj\" (UID: \"72c655a1-070d-4965-9949-6b3080d99104\") " pod="openshift-infra/auto-csr-approver-29566898-2zntj"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.476861 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-2zntj"
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.719890 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-2zntj"]
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.887338 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5xn9" event={"ID":"28a0d812-045a-4785-8e93-d568705c3846","Type":"ContainerStarted","Data":"fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39"}
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.889333 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-2zntj" event={"ID":"72c655a1-070d-4965-9949-6b3080d99104","Type":"ContainerStarted","Data":"ac321494ca7d0e3cad98fea895d668b1a0fb1fc9f76eb429b19d6b64aef9bb9e"}
Mar 20 13:38:00 crc kubenswrapper[4849]: I0320 13:38:00.910784 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c5xn9" podStartSLOduration=2.426284727 podStartE2EDuration="4.91076677s" podCreationTimestamp="2026-03-20 13:37:56 +0000 UTC" firstStartedPulling="2026-03-20 13:37:57.847204805 +0000 UTC m=+827.524928240" lastFinishedPulling="2026-03-20 13:38:00.331686868 +0000 UTC m=+830.009410283" observedRunningTime="2026-03-20 13:38:00.907204422 +0000 UTC m=+830.584927817" watchObservedRunningTime="2026-03-20 13:38:00.91076677 +0000 UTC m=+830.588490165"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.336884 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f45bq"]
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.337541 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f45bq"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.339335 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.339459 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-h4rts"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.339719 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.347180 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f45bq"]
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.506446 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpv6n\" (UniqueName: \"kubernetes.io/projected/08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db-kube-api-access-wpv6n\") pod \"nmstate-operator-796d4cfff4-f45bq\" (UID: \"08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f45bq"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.607412 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpv6n\" (UniqueName: \"kubernetes.io/projected/08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db-kube-api-access-wpv6n\") pod \"nmstate-operator-796d4cfff4-f45bq\" (UID: \"08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f45bq"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.628065 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpv6n\" (UniqueName: \"kubernetes.io/projected/08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db-kube-api-access-wpv6n\") pod \"nmstate-operator-796d4cfff4-f45bq\" (UID: \"08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f45bq"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.655172 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f45bq"
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.885301 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f45bq"]
Mar 20 13:38:01 crc kubenswrapper[4849]: W0320 13:38:01.894277 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e1d2e6_28d6_4dcf_bf0f_ea6c92abc7db.slice/crio-3bc13485518528f75e39f92f010b7812f1d52728cda1b6f254d3a7ed518e167c WatchSource:0}: Error finding container 3bc13485518528f75e39f92f010b7812f1d52728cda1b6f254d3a7ed518e167c: Status 404 returned error can't find the container with id 3bc13485518528f75e39f92f010b7812f1d52728cda1b6f254d3a7ed518e167c
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.910654 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-2zntj" event={"ID":"72c655a1-070d-4965-9949-6b3080d99104","Type":"ContainerStarted","Data":"5184e57a6bad63b6b1b3dc7a9fcadb96b54de3bf36fd9b712578daac179fb823"}
Mar 20 13:38:01 crc kubenswrapper[4849]: I0320 13:38:01.930951 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566898-2zntj" podStartSLOduration=1.088607083 podStartE2EDuration="1.930934114s" podCreationTimestamp="2026-03-20 13:38:00 +0000 UTC" firstStartedPulling="2026-03-20 13:38:00.734050287 +0000 UTC m=+830.411773682" lastFinishedPulling="2026-03-20 13:38:01.576377318 +0000 UTC m=+831.254100713" observedRunningTime="2026-03-20 13:38:01.929928056 +0000 UTC m=+831.607651481" watchObservedRunningTime="2026-03-20 13:38:01.930934114 +0000 UTC m=+831.608657509"
Mar 20 13:38:02 crc kubenswrapper[4849]: I0320 13:38:02.917183 4849 generic.go:334] "Generic (PLEG): container finished" podID="72c655a1-070d-4965-9949-6b3080d99104" containerID="5184e57a6bad63b6b1b3dc7a9fcadb96b54de3bf36fd9b712578daac179fb823" exitCode=0
Mar 20 13:38:02 crc kubenswrapper[4849]: I0320 13:38:02.917253 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-2zntj" event={"ID":"72c655a1-070d-4965-9949-6b3080d99104","Type":"ContainerDied","Data":"5184e57a6bad63b6b1b3dc7a9fcadb96b54de3bf36fd9b712578daac179fb823"}
Mar 20 13:38:02 crc kubenswrapper[4849]: I0320 13:38:02.919008 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f45bq" event={"ID":"08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db","Type":"ContainerStarted","Data":"3bc13485518528f75e39f92f010b7812f1d52728cda1b6f254d3a7ed518e167c"}
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.135481 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-2zntj"
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.245671 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xsll\" (UniqueName: \"kubernetes.io/projected/72c655a1-070d-4965-9949-6b3080d99104-kube-api-access-4xsll\") pod \"72c655a1-070d-4965-9949-6b3080d99104\" (UID: \"72c655a1-070d-4965-9949-6b3080d99104\") "
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.257696 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c655a1-070d-4965-9949-6b3080d99104-kube-api-access-4xsll" (OuterVolumeSpecName: "kube-api-access-4xsll") pod "72c655a1-070d-4965-9949-6b3080d99104" (UID: "72c655a1-070d-4965-9949-6b3080d99104"). InnerVolumeSpecName "kube-api-access-4xsll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.347030 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xsll\" (UniqueName: \"kubernetes.io/projected/72c655a1-070d-4965-9949-6b3080d99104-kube-api-access-4xsll\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.930046 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f45bq" event={"ID":"08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db","Type":"ContainerStarted","Data":"f4b9c07793f4a9f16b75edd70ba019e5418454047e372cc4b54090a93399c330"}
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.931618 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-2zntj" event={"ID":"72c655a1-070d-4965-9949-6b3080d99104","Type":"ContainerDied","Data":"ac321494ca7d0e3cad98fea895d668b1a0fb1fc9f76eb429b19d6b64aef9bb9e"}
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.931651 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac321494ca7d0e3cad98fea895d668b1a0fb1fc9f76eb429b19d6b64aef9bb9e"
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.931649 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-2zntj"
Mar 20 13:38:04 crc kubenswrapper[4849]: I0320 13:38:04.960984 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f45bq" podStartSLOduration=1.9881710259999998 podStartE2EDuration="3.960962238s" podCreationTimestamp="2026-03-20 13:38:01 +0000 UTC" firstStartedPulling="2026-03-20 13:38:01.896628582 +0000 UTC m=+831.574351977" lastFinishedPulling="2026-03-20 13:38:03.869419774 +0000 UTC m=+833.547143189" observedRunningTime="2026-03-20 13:38:04.957009089 +0000 UTC m=+834.634732484" watchObservedRunningTime="2026-03-20 13:38:04.960962238 +0000 UTC m=+834.638685643"
Mar 20 13:38:05 crc kubenswrapper[4849]: I0320 13:38:05.000477 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-n2d8k"]
Mar 20 13:38:05 crc kubenswrapper[4849]: I0320 13:38:05.003926 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-n2d8k"]
Mar 20 13:38:05 crc kubenswrapper[4849]: I0320 13:38:05.042041 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b" path="/var/lib/kubelet/pods/258d5ef3-c8b4-4b41-bd1f-8c742d2edd9b/volumes"
Mar 20 13:38:06 crc kubenswrapper[4849]: I0320 13:38:06.371529 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c5xn9"
Mar 20 13:38:06 crc kubenswrapper[4849]: I0320 13:38:06.371857 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c5xn9"
Mar 20 13:38:07 crc kubenswrapper[4849]: I0320 13:38:07.412996 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c5xn9" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="registry-server" probeResult="failure" output=<
Mar 20 13:38:07 crc kubenswrapper[4849]: timeout: failed to connect service ":50051" within 1s
Mar 20 13:38:07 crc kubenswrapper[4849]: >
Mar 20 13:38:09 crc kubenswrapper[4849]: I0320 13:38:09.385351 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:38:09 crc kubenswrapper[4849]: I0320 13:38:09.385460 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:38:09 crc kubenswrapper[4849]: I0320 13:38:09.385539 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl"
Mar 20 13:38:09 crc kubenswrapper[4849]: I0320 13:38:09.386717 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b38365e3077d108f503fa5f04333e6db13420f95bbb3b1017d115a7dd6908444"} pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:38:09 crc kubenswrapper[4849]: I0320 13:38:09.386878 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" containerID="cri-o://b38365e3077d108f503fa5f04333e6db13420f95bbb3b1017d115a7dd6908444" gracePeriod=600
Mar 20 13:38:10 crc kubenswrapper[4849]: I0320 13:38:10.979612 4849 generic.go:334] "Generic (PLEG): container finished" podID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerID="b38365e3077d108f503fa5f04333e6db13420f95bbb3b1017d115a7dd6908444" exitCode=0
Mar 20 13:38:10 crc kubenswrapper[4849]: I0320 13:38:10.979696 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerDied","Data":"b38365e3077d108f503fa5f04333e6db13420f95bbb3b1017d115a7dd6908444"}
Mar 20 13:38:10 crc kubenswrapper[4849]: I0320 13:38:10.980372 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"796d63258641a3af91f7958992403b9a5ad9b68fcc83db460f8e4cc151f123e0"}
Mar 20 13:38:10 crc kubenswrapper[4849]: I0320 13:38:10.980398 4849 scope.go:117] "RemoveContainer" containerID="130aa48b337a88e578102daf38d6fb66cf9cae0791d30e61767b78bd10649ad0"
Mar 20 13:38:16 crc kubenswrapper[4849]: I0320 13:38:16.434762 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c5xn9"
Mar 20 13:38:16 crc kubenswrapper[4849]: I0320 13:38:16.491299 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c5xn9"
Mar 20 13:38:16 crc kubenswrapper[4849]: I0320 13:38:16.671042 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c5xn9"]
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.033634 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c5xn9" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="registry-server" containerID="cri-o://fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39" gracePeriod=2
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.461071 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c5xn9"
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.656111 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-catalog-content\") pod \"28a0d812-045a-4785-8e93-d568705c3846\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") "
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.656191 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-utilities\") pod \"28a0d812-045a-4785-8e93-d568705c3846\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") "
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.656327 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts4lm\" (UniqueName: \"kubernetes.io/projected/28a0d812-045a-4785-8e93-d568705c3846-kube-api-access-ts4lm\") pod \"28a0d812-045a-4785-8e93-d568705c3846\" (UID: \"28a0d812-045a-4785-8e93-d568705c3846\") "
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.657925 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-utilities" (OuterVolumeSpecName: "utilities") pod "28a0d812-045a-4785-8e93-d568705c3846" (UID: "28a0d812-045a-4785-8e93-d568705c3846"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.662867 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a0d812-045a-4785-8e93-d568705c3846-kube-api-access-ts4lm" (OuterVolumeSpecName: "kube-api-access-ts4lm") pod "28a0d812-045a-4785-8e93-d568705c3846" (UID: "28a0d812-045a-4785-8e93-d568705c3846"). InnerVolumeSpecName "kube-api-access-ts4lm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.759154 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.759213 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts4lm\" (UniqueName: \"kubernetes.io/projected/28a0d812-045a-4785-8e93-d568705c3846-kube-api-access-ts4lm\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.860087 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28a0d812-045a-4785-8e93-d568705c3846" (UID: "28a0d812-045a-4785-8e93-d568705c3846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:38:18 crc kubenswrapper[4849]: I0320 13:38:18.860421 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28a0d812-045a-4785-8e93-d568705c3846-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.042343 4849 generic.go:334] "Generic (PLEG): container finished" podID="28a0d812-045a-4785-8e93-d568705c3846" containerID="fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39" exitCode=0
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.042438 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c5xn9"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.044404 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5xn9" event={"ID":"28a0d812-045a-4785-8e93-d568705c3846","Type":"ContainerDied","Data":"fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39"}
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.044478 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c5xn9" event={"ID":"28a0d812-045a-4785-8e93-d568705c3846","Type":"ContainerDied","Data":"827223535ff2d6318f3ce84b8ec71216b9722896d1d9b697f352bd3a693e30ca"}
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.044517 4849 scope.go:117] "RemoveContainer" containerID="fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.066139 4849 scope.go:117] "RemoveContainer" containerID="a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.083792 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c5xn9"]
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.088260 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c5xn9"]
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.092278 4849 scope.go:117] "RemoveContainer" containerID="01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.114488 4849 scope.go:117] "RemoveContainer" containerID="fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39"
Mar 20 13:38:19 crc kubenswrapper[4849]: E0320 13:38:19.114897 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39\": container with ID starting with fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39 not found: ID does not exist" containerID="fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.114959 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39"} err="failed to get container status \"fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39\": rpc error: code = NotFound desc = could not find container \"fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39\": container with ID starting with fd1ed161f71b6bcf2a932cc75f382bb3aff8509c3fc7cfdff3be7f11879e9c39 not found: ID does not exist"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.114988 4849 scope.go:117] "RemoveContainer" containerID="a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597"
Mar 20 13:38:19 crc kubenswrapper[4849]: E0320 13:38:19.115283 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597\": container with ID starting with a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597 not found: ID does not exist" containerID="a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.115393 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597"} err="failed to get container status \"a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597\": rpc error: code = NotFound desc = could not find container \"a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597\": container with ID starting with a16fdd6793d47c14ac2a7a54024f9e2dd8c787f7d94d2456d9448a2fb5f00597 not found: ID does not exist"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.115466 4849 scope.go:117] "RemoveContainer" containerID="01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116"
Mar 20 13:38:19 crc kubenswrapper[4849]: E0320 13:38:19.115793 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116\": container with ID starting with 01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116 not found: ID does not exist" containerID="01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116"
Mar 20 13:38:19 crc kubenswrapper[4849]: I0320 13:38:19.115842 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116"} err="failed to get container status \"01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116\": rpc error: code = NotFound desc = could not find container \"01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116\": container with ID starting with 01d46f89f9c8e9b487aa27e7738d4d3847588923073e737647fab2a473285116 not found: ID does not exist"
Mar 20 13:38:21 crc kubenswrapper[4849]: I0320 13:38:21.044023 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a0d812-045a-4785-8e93-d568705c3846" path="/var/lib/kubelet/pods/28a0d812-045a-4785-8e93-d568705c3846/volumes"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.139965 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr"]
Mar 20 13:38:27 crc kubenswrapper[4849]: E0320 13:38:27.140490 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="extract-content"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.140506 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="extract-content"
Mar 20 13:38:27 crc kubenswrapper[4849]: E0320 13:38:27.140526 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c655a1-070d-4965-9949-6b3080d99104" containerName="oc"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.140534 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c655a1-070d-4965-9949-6b3080d99104" containerName="oc"
Mar 20 13:38:27 crc kubenswrapper[4849]: E0320 13:38:27.140544 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="extract-utilities"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.140553 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="extract-utilities"
Mar 20 13:38:27 crc kubenswrapper[4849]: E0320 13:38:27.140566 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="registry-server"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.140575 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="registry-server"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.140693 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c655a1-070d-4965-9949-6b3080d99104" containerName="oc"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.140710 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="28a0d812-045a-4785-8e93-d568705c3846" containerName="registry-server"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.141359 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.143593 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-v2dfg"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.144811 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v"]
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.145632 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.147139 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.162748 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr"]
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.171554 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-p5xfw"]
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.172220 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.182238 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v"]
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.278020 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnq7j\" (UniqueName: \"kubernetes.io/projected/b059c698-6411-477d-b7de-3da2b096a013-kube-api-access-wnq7j\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.278067 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh7zg\" (UniqueName: \"kubernetes.io/projected/8dad839f-5780-4ace-a89d-b4e79b51f2be-kube-api-access-jh7zg\") pod \"nmstate-webhook-5f558f5558-jnp5v\" (UID: \"8dad839f-5780-4ace-a89d-b4e79b51f2be\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.278100 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjrs\" (UniqueName: \"kubernetes.io/projected/2c219def-f597-4e67-87d7-61844f8980e0-kube-api-access-6bjrs\") pod \"nmstate-metrics-9b8c8685d-h6gpr\" (UID: \"2c219def-f597-4e67-87d7-61844f8980e0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.278123 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-nmstate-lock\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.278145 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-ovs-socket\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.278205 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8dad839f-5780-4ace-a89d-b4e79b51f2be-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-jnp5v\" (UID: \"8dad839f-5780-4ace-a89d-b4e79b51f2be\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.278222 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-dbus-socket\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.297691 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s"]
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.298539 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.300880 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.300895 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-gkskx"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.303055 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.306133 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s"]
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379181 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjrs\" (UniqueName: \"kubernetes.io/projected/2c219def-f597-4e67-87d7-61844f8980e0-kube-api-access-6bjrs\") pod \"nmstate-metrics-9b8c8685d-h6gpr\" (UID: \"2c219def-f597-4e67-87d7-61844f8980e0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379239 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-nmstate-lock\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379280 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-nmstate-lock\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379553 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-ovs-socket\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379586 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-ovs-socket\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379609 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8dad839f-5780-4ace-a89d-b4e79b51f2be-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-jnp5v\" (UID: \"8dad839f-5780-4ace-a89d-b4e79b51f2be\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379632 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-dbus-socket\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379660 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnq7j\" (UniqueName: \"kubernetes.io/projected/b059c698-6411-477d-b7de-3da2b096a013-kube-api-access-wnq7j\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw"
Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.379692 4849 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"kube-api-access-jh7zg\" (UniqueName: \"kubernetes.io/projected/8dad839f-5780-4ace-a89d-b4e79b51f2be-kube-api-access-jh7zg\") pod \"nmstate-webhook-5f558f5558-jnp5v\" (UID: \"8dad839f-5780-4ace-a89d-b4e79b51f2be\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.380162 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/b059c698-6411-477d-b7de-3da2b096a013-dbus-socket\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.390568 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8dad839f-5780-4ace-a89d-b4e79b51f2be-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-jnp5v\" (UID: \"8dad839f-5780-4ace-a89d-b4e79b51f2be\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.398776 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh7zg\" (UniqueName: \"kubernetes.io/projected/8dad839f-5780-4ace-a89d-b4e79b51f2be-kube-api-access-jh7zg\") pod \"nmstate-webhook-5f558f5558-jnp5v\" (UID: \"8dad839f-5780-4ace-a89d-b4e79b51f2be\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.401325 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnq7j\" (UniqueName: \"kubernetes.io/projected/b059c698-6411-477d-b7de-3da2b096a013-kube-api-access-wnq7j\") pod \"nmstate-handler-p5xfw\" (UID: \"b059c698-6411-477d-b7de-3da2b096a013\") " pod="openshift-nmstate/nmstate-handler-p5xfw" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.405719 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6bjrs\" (UniqueName: \"kubernetes.io/projected/2c219def-f597-4e67-87d7-61844f8980e0-kube-api-access-6bjrs\") pod \"nmstate-metrics-9b8c8685d-h6gpr\" (UID: \"2c219def-f597-4e67-87d7-61844f8980e0\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.462618 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.464231 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-764b7667b4-dcxzs"] Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.464893 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.475529 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764b7667b4-dcxzs"] Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.479168 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.480407 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-oauth-serving-cert\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.480516 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z55jc\" (UniqueName: \"kubernetes.io/projected/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-kube-api-access-z55jc\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.480617 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/95c4282d-b3de-46dc-9806-c6cc81e3bde2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.480691 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/95c4282d-b3de-46dc-9806-c6cc81e3bde2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.480762 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-oauth-config\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.480856 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx882\" (UniqueName: \"kubernetes.io/projected/95c4282d-b3de-46dc-9806-c6cc81e3bde2-kube-api-access-wx882\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.480961 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-service-ca\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.481043 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-trusted-ca-bundle\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.481152 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-serving-cert\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.481243 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-config\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.494425 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-p5xfw" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582238 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-serving-cert\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582276 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-config\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582296 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-oauth-serving-cert\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582314 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z55jc\" (UniqueName: \"kubernetes.io/projected/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-kube-api-access-z55jc\") pod \"console-764b7667b4-dcxzs\" (UID: 
\"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582343 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/95c4282d-b3de-46dc-9806-c6cc81e3bde2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582364 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/95c4282d-b3de-46dc-9806-c6cc81e3bde2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582382 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-oauth-config\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582406 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx882\" (UniqueName: \"kubernetes.io/projected/95c4282d-b3de-46dc-9806-c6cc81e3bde2-kube-api-access-wx882\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582432 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-service-ca\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.582446 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-trusted-ca-bundle\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.583354 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-config\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.584100 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-oauth-serving-cert\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.584278 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-trusted-ca-bundle\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.586231 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/95c4282d-b3de-46dc-9806-c6cc81e3bde2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.586462 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-service-ca\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.600480 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-oauth-config\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.600918 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/95c4282d-b3de-46dc-9806-c6cc81e3bde2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.604267 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z55jc\" (UniqueName: \"kubernetes.io/projected/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-kube-api-access-z55jc\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.604314 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/40e2b53e-99d7-4a61-bf27-ab9e41568d0e-console-serving-cert\") pod \"console-764b7667b4-dcxzs\" (UID: \"40e2b53e-99d7-4a61-bf27-ab9e41568d0e\") " pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.606348 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx882\" (UniqueName: \"kubernetes.io/projected/95c4282d-b3de-46dc-9806-c6cc81e3bde2-kube-api-access-wx882\") pod \"nmstate-console-plugin-86f58fcf4-dlg5s\" (UID: \"95c4282d-b3de-46dc-9806-c6cc81e3bde2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.614440 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.718119 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v"] Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.788156 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s"] Mar 20 13:38:27 crc kubenswrapper[4849]: W0320 13:38:27.792456 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95c4282d_b3de_46dc_9806_c6cc81e3bde2.slice/crio-08ac56fcb11c23e3ebbd824606b2cdf72e72fe05645fb00b06a46d14b5132d05 WatchSource:0}: Error finding container 08ac56fcb11c23e3ebbd824606b2cdf72e72fe05645fb00b06a46d14b5132d05: Status 404 returned error can't find the container with id 08ac56fcb11c23e3ebbd824606b2cdf72e72fe05645fb00b06a46d14b5132d05 Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.831118 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:27 crc kubenswrapper[4849]: I0320 13:38:27.900073 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr"] Mar 20 13:38:27 crc kubenswrapper[4849]: W0320 13:38:27.909188 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c219def_f597_4e67_87d7_61844f8980e0.slice/crio-9b08142b70da3bf66306c7784edd40b487d25f7c23770485a400eb5a9b23bf13 WatchSource:0}: Error finding container 9b08142b70da3bf66306c7784edd40b487d25f7c23770485a400eb5a9b23bf13: Status 404 returned error can't find the container with id 9b08142b70da3bf66306c7784edd40b487d25f7c23770485a400eb5a9b23bf13 Mar 20 13:38:28 crc kubenswrapper[4849]: I0320 13:38:28.018607 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764b7667b4-dcxzs"] Mar 20 13:38:28 crc kubenswrapper[4849]: W0320 13:38:28.024519 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e2b53e_99d7_4a61_bf27_ab9e41568d0e.slice/crio-8a8973d2a5911fe0e363b4e6b1729518e89e4328e59978d662e1b59490ddf3a5 WatchSource:0}: Error finding container 8a8973d2a5911fe0e363b4e6b1729518e89e4328e59978d662e1b59490ddf3a5: Status 404 returned error can't find the container with id 8a8973d2a5911fe0e363b4e6b1729518e89e4328e59978d662e1b59490ddf3a5 Mar 20 13:38:28 crc kubenswrapper[4849]: I0320 13:38:28.092741 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" event={"ID":"8dad839f-5780-4ace-a89d-b4e79b51f2be","Type":"ContainerStarted","Data":"f0332016fda9c11736902a26aea742459e650b35bea330505cb89fff63091c2a"} Mar 20 13:38:28 crc kubenswrapper[4849]: I0320 13:38:28.093757 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" event={"ID":"95c4282d-b3de-46dc-9806-c6cc81e3bde2","Type":"ContainerStarted","Data":"08ac56fcb11c23e3ebbd824606b2cdf72e72fe05645fb00b06a46d14b5132d05"} Mar 20 13:38:28 crc kubenswrapper[4849]: I0320 13:38:28.094594 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr" event={"ID":"2c219def-f597-4e67-87d7-61844f8980e0","Type":"ContainerStarted","Data":"9b08142b70da3bf66306c7784edd40b487d25f7c23770485a400eb5a9b23bf13"} Mar 20 13:38:28 crc kubenswrapper[4849]: I0320 13:38:28.095356 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764b7667b4-dcxzs" event={"ID":"40e2b53e-99d7-4a61-bf27-ab9e41568d0e","Type":"ContainerStarted","Data":"8a8973d2a5911fe0e363b4e6b1729518e89e4328e59978d662e1b59490ddf3a5"} Mar 20 13:38:28 crc kubenswrapper[4849]: I0320 13:38:28.096471 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-p5xfw" event={"ID":"b059c698-6411-477d-b7de-3da2b096a013","Type":"ContainerStarted","Data":"4c6c052e89c7c191f439f7a915909b37a904b82875e0c41ba735b65de8fac1fd"} Mar 20 13:38:29 crc kubenswrapper[4849]: I0320 13:38:29.109907 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764b7667b4-dcxzs" event={"ID":"40e2b53e-99d7-4a61-bf27-ab9e41568d0e","Type":"ContainerStarted","Data":"291ff120a5238d4099215f16405f53ad789b038dd6c1f178e60f284aefe205fd"} Mar 20 13:38:29 crc kubenswrapper[4849]: I0320 13:38:29.140145 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-764b7667b4-dcxzs" podStartSLOduration=2.140127299 podStartE2EDuration="2.140127299s" podCreationTimestamp="2026-03-20 13:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:38:29.13830199 +0000 UTC m=+858.816025405" 
watchObservedRunningTime="2026-03-20 13:38:29.140127299 +0000 UTC m=+858.817850694" Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.124250 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-p5xfw" event={"ID":"b059c698-6411-477d-b7de-3da2b096a013","Type":"ContainerStarted","Data":"cdd2b24d4645bb166cfd88db7b9d1fcd6db6428d347a0a74946a02832e64cd4e"} Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.124967 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-p5xfw" Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.127796 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" event={"ID":"8dad839f-5780-4ace-a89d-b4e79b51f2be","Type":"ContainerStarted","Data":"8060640ee0447f6bcc9108891f33286f1957850a34707ac437716a23027c8b0d"} Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.128357 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.129740 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" event={"ID":"95c4282d-b3de-46dc-9806-c6cc81e3bde2","Type":"ContainerStarted","Data":"8b612d6ed570f6fb303b03d69d3e322820bd14cb6d1362ebebfd8abccef39303"} Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.131189 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr" event={"ID":"2c219def-f597-4e67-87d7-61844f8980e0","Type":"ContainerStarted","Data":"1c3091c8debdf37fb149c5339be6ad5906a15fed6fbbf8110b47e14b680658ef"} Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.167835 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-p5xfw" podStartSLOduration=1.144216633 
podStartE2EDuration="4.167797567s" podCreationTimestamp="2026-03-20 13:38:27 +0000 UTC" firstStartedPulling="2026-03-20 13:38:27.599173529 +0000 UTC m=+857.276896924" lastFinishedPulling="2026-03-20 13:38:30.622754463 +0000 UTC m=+860.300477858" observedRunningTime="2026-03-20 13:38:31.148336291 +0000 UTC m=+860.826059686" watchObservedRunningTime="2026-03-20 13:38:31.167797567 +0000 UTC m=+860.845520962" Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.169484 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-dlg5s" podStartSLOduration=1.349998783 podStartE2EDuration="4.169477183s" podCreationTimestamp="2026-03-20 13:38:27 +0000 UTC" firstStartedPulling="2026-03-20 13:38:27.794558018 +0000 UTC m=+857.472281413" lastFinishedPulling="2026-03-20 13:38:30.614036378 +0000 UTC m=+860.291759813" observedRunningTime="2026-03-20 13:38:31.164758125 +0000 UTC m=+860.842481530" watchObservedRunningTime="2026-03-20 13:38:31.169477183 +0000 UTC m=+860.847200578" Mar 20 13:38:31 crc kubenswrapper[4849]: I0320 13:38:31.187662 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" podStartSLOduration=1.289425595 podStartE2EDuration="4.187641583s" podCreationTimestamp="2026-03-20 13:38:27 +0000 UTC" firstStartedPulling="2026-03-20 13:38:27.724542206 +0000 UTC m=+857.402265601" lastFinishedPulling="2026-03-20 13:38:30.622758164 +0000 UTC m=+860.300481589" observedRunningTime="2026-03-20 13:38:31.181381704 +0000 UTC m=+860.859105099" watchObservedRunningTime="2026-03-20 13:38:31.187641583 +0000 UTC m=+860.865364978" Mar 20 13:38:32 crc kubenswrapper[4849]: I0320 13:38:32.861749 4849 scope.go:117] "RemoveContainer" containerID="926410ea79e7d0fe1269b3af88e4d6f3cd330020079ba9251447277e91d6a84e" Mar 20 13:38:34 crc kubenswrapper[4849]: I0320 13:38:34.152371 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr" event={"ID":"2c219def-f597-4e67-87d7-61844f8980e0","Type":"ContainerStarted","Data":"eeb84ceffb774e46530b1dc215ce0a7fd3a486df2b66a5f903e30eda28022174"} Mar 20 13:38:34 crc kubenswrapper[4849]: I0320 13:38:34.171572 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-h6gpr" podStartSLOduration=1.636211466 podStartE2EDuration="7.171555786s" podCreationTimestamp="2026-03-20 13:38:27 +0000 UTC" firstStartedPulling="2026-03-20 13:38:27.913260985 +0000 UTC m=+857.590984380" lastFinishedPulling="2026-03-20 13:38:33.448605305 +0000 UTC m=+863.126328700" observedRunningTime="2026-03-20 13:38:34.171431223 +0000 UTC m=+863.849154648" watchObservedRunningTime="2026-03-20 13:38:34.171555786 +0000 UTC m=+863.849279181" Mar 20 13:38:37 crc kubenswrapper[4849]: I0320 13:38:37.518101 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-p5xfw" Mar 20 13:38:37 crc kubenswrapper[4849]: I0320 13:38:37.832178 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:37 crc kubenswrapper[4849]: I0320 13:38:37.832256 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:37 crc kubenswrapper[4849]: I0320 13:38:37.840934 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:38 crc kubenswrapper[4849]: I0320 13:38:38.182531 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-764b7667b4-dcxzs" Mar 20 13:38:38 crc kubenswrapper[4849]: I0320 13:38:38.243719 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ztzl5"] Mar 20 13:38:47 crc kubenswrapper[4849]: I0320 13:38:47.489174 4849 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-jnp5v" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.195738 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2"] Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.197734 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.200137 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.216107 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2"] Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.313365 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.313421 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.313499 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v429c\" (UniqueName: \"kubernetes.io/projected/e50fa65a-4209-4a3e-8626-278c3920e206-kube-api-access-v429c\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.414639 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.414722 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.414876 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v429c\" (UniqueName: \"kubernetes.io/projected/e50fa65a-4209-4a3e-8626-278c3920e206-kube-api-access-v429c\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.415254 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.415626 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.445605 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v429c\" (UniqueName: \"kubernetes.io/projected/e50fa65a-4209-4a3e-8626-278c3920e206-kube-api-access-v429c\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.517279 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:38:59 crc kubenswrapper[4849]: I0320 13:38:59.741709 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2"] Mar 20 13:39:00 crc kubenswrapper[4849]: I0320 13:39:00.548511 4849 generic.go:334] "Generic (PLEG): container finished" podID="e50fa65a-4209-4a3e-8626-278c3920e206" containerID="9d1f501c9bf468819fe52b1449646e6dabe220b2bf67924f7a164551391ea64e" exitCode=0 Mar 20 13:39:00 crc kubenswrapper[4849]: I0320 13:39:00.548636 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" event={"ID":"e50fa65a-4209-4a3e-8626-278c3920e206","Type":"ContainerDied","Data":"9d1f501c9bf468819fe52b1449646e6dabe220b2bf67924f7a164551391ea64e"} Mar 20 13:39:00 crc kubenswrapper[4849]: I0320 13:39:00.548885 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" event={"ID":"e50fa65a-4209-4a3e-8626-278c3920e206","Type":"ContainerStarted","Data":"c6ccc56db0c0b66e364dd95225d457b1c5feb83690e3cb24ec3f9832b3f0220d"} Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.279408 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ztzl5" podUID="200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" containerName="console" containerID="cri-o://1026adec484469cd5dd09bb7b2c530d14fe126bee7efc4949e17180971c4c608" gracePeriod=15 Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.576546 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ztzl5_200191b3-9ea4-4ed7-b4b1-05e8ce9d3537/console/0.log" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.576975 4849 generic.go:334] "Generic (PLEG): container 
finished" podID="200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" containerID="1026adec484469cd5dd09bb7b2c530d14fe126bee7efc4949e17180971c4c608" exitCode=2 Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.577067 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ztzl5" event={"ID":"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537","Type":"ContainerDied","Data":"1026adec484469cd5dd09bb7b2c530d14fe126bee7efc4949e17180971c4c608"} Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.579307 4849 generic.go:334] "Generic (PLEG): container finished" podID="e50fa65a-4209-4a3e-8626-278c3920e206" containerID="0f1f2ac61063a08b592d5e4f28b6837382f8b4a132fb89cc01f1fdae314be3ad" exitCode=0 Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.579342 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" event={"ID":"e50fa65a-4209-4a3e-8626-278c3920e206","Type":"ContainerDied","Data":"0f1f2ac61063a08b592d5e4f28b6837382f8b4a132fb89cc01f1fdae314be3ad"} Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.686510 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ztzl5_200191b3-9ea4-4ed7-b4b1-05e8ce9d3537/console/0.log" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.686566 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.872342 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-trusted-ca-bundle\") pod \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.872423 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-service-ca\") pod \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.872453 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-oauth-config\") pod \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.872483 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-config\") pod \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.872509 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-oauth-serving-cert\") pod \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.872550 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-serving-cert\") pod \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.872585 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbcd2\" (UniqueName: \"kubernetes.io/projected/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-kube-api-access-nbcd2\") pod \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\" (UID: \"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537\") " Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.873065 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" (UID: "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.873077 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-service-ca" (OuterVolumeSpecName: "service-ca") pod "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" (UID: "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.873358 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" (UID: "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.873792 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-config" (OuterVolumeSpecName: "console-config") pod "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" (UID: "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.877748 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-kube-api-access-nbcd2" (OuterVolumeSpecName: "kube-api-access-nbcd2") pod "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" (UID: "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537"). InnerVolumeSpecName "kube-api-access-nbcd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.877946 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" (UID: "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.878839 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" (UID: "200191b3-9ea4-4ed7-b4b1-05e8ce9d3537"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.973538 4849 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.973570 4849 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.973579 4849 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.973603 4849 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.973612 4849 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.973619 4849 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:03 crc kubenswrapper[4849]: I0320 13:39:03.973627 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbcd2\" (UniqueName: \"kubernetes.io/projected/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537-kube-api-access-nbcd2\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:04 crc 
kubenswrapper[4849]: I0320 13:39:04.596256 4849 generic.go:334] "Generic (PLEG): container finished" podID="e50fa65a-4209-4a3e-8626-278c3920e206" containerID="55c53021289c90afac665f29475bd18b8f765b31fb98a8d9e25313c60a9e25a4" exitCode=0 Mar 20 13:39:04 crc kubenswrapper[4849]: I0320 13:39:04.596361 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" event={"ID":"e50fa65a-4209-4a3e-8626-278c3920e206","Type":"ContainerDied","Data":"55c53021289c90afac665f29475bd18b8f765b31fb98a8d9e25313c60a9e25a4"} Mar 20 13:39:04 crc kubenswrapper[4849]: I0320 13:39:04.600796 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ztzl5_200191b3-9ea4-4ed7-b4b1-05e8ce9d3537/console/0.log" Mar 20 13:39:04 crc kubenswrapper[4849]: I0320 13:39:04.600931 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ztzl5" event={"ID":"200191b3-9ea4-4ed7-b4b1-05e8ce9d3537","Type":"ContainerDied","Data":"5da13af49286494d17e3f05668954aa773d596e0dc25776a32d0f00c8ad64a9f"} Mar 20 13:39:04 crc kubenswrapper[4849]: I0320 13:39:04.600985 4849 scope.go:117] "RemoveContainer" containerID="1026adec484469cd5dd09bb7b2c530d14fe126bee7efc4949e17180971c4c608" Mar 20 13:39:04 crc kubenswrapper[4849]: I0320 13:39:04.601005 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ztzl5" Mar 20 13:39:04 crc kubenswrapper[4849]: I0320 13:39:04.648358 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ztzl5"] Mar 20 13:39:04 crc kubenswrapper[4849]: I0320 13:39:04.651305 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ztzl5"] Mar 20 13:39:05 crc kubenswrapper[4849]: I0320 13:39:05.043596 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" path="/var/lib/kubelet/pods/200191b3-9ea4-4ed7-b4b1-05e8ce9d3537/volumes" Mar 20 13:39:05 crc kubenswrapper[4849]: I0320 13:39:05.915911 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.100051 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v429c\" (UniqueName: \"kubernetes.io/projected/e50fa65a-4209-4a3e-8626-278c3920e206-kube-api-access-v429c\") pod \"e50fa65a-4209-4a3e-8626-278c3920e206\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.100316 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-bundle\") pod \"e50fa65a-4209-4a3e-8626-278c3920e206\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.100396 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-util\") pod \"e50fa65a-4209-4a3e-8626-278c3920e206\" (UID: \"e50fa65a-4209-4a3e-8626-278c3920e206\") " Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.103005 
4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-bundle" (OuterVolumeSpecName: "bundle") pod "e50fa65a-4209-4a3e-8626-278c3920e206" (UID: "e50fa65a-4209-4a3e-8626-278c3920e206"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.108013 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50fa65a-4209-4a3e-8626-278c3920e206-kube-api-access-v429c" (OuterVolumeSpecName: "kube-api-access-v429c") pod "e50fa65a-4209-4a3e-8626-278c3920e206" (UID: "e50fa65a-4209-4a3e-8626-278c3920e206"). InnerVolumeSpecName "kube-api-access-v429c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.115355 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-util" (OuterVolumeSpecName: "util") pod "e50fa65a-4209-4a3e-8626-278c3920e206" (UID: "e50fa65a-4209-4a3e-8626-278c3920e206"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.202218 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v429c\" (UniqueName: \"kubernetes.io/projected/e50fa65a-4209-4a3e-8626-278c3920e206-kube-api-access-v429c\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.202285 4849 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.202309 4849 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e50fa65a-4209-4a3e-8626-278c3920e206-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.620009 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" event={"ID":"e50fa65a-4209-4a3e-8626-278c3920e206","Type":"ContainerDied","Data":"c6ccc56db0c0b66e364dd95225d457b1c5feb83690e3cb24ec3f9832b3f0220d"} Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.620046 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6ccc56db0c0b66e364dd95225d457b1c5feb83690e3cb24ec3f9832b3f0220d" Mar 20 13:39:06 crc kubenswrapper[4849]: I0320 13:39:06.620132 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.134070 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp"] Mar 20 13:39:14 crc kubenswrapper[4849]: E0320 13:39:14.134858 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50fa65a-4209-4a3e-8626-278c3920e206" containerName="util" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.134871 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50fa65a-4209-4a3e-8626-278c3920e206" containerName="util" Mar 20 13:39:14 crc kubenswrapper[4849]: E0320 13:39:14.134882 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" containerName="console" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.134889 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" containerName="console" Mar 20 13:39:14 crc kubenswrapper[4849]: E0320 13:39:14.134903 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50fa65a-4209-4a3e-8626-278c3920e206" containerName="extract" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.134910 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50fa65a-4209-4a3e-8626-278c3920e206" containerName="extract" Mar 20 13:39:14 crc kubenswrapper[4849]: E0320 13:39:14.134919 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50fa65a-4209-4a3e-8626-278c3920e206" containerName="pull" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.134925 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50fa65a-4209-4a3e-8626-278c3920e206" containerName="pull" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.135018 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e50fa65a-4209-4a3e-8626-278c3920e206" 
containerName="extract" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.135030 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="200191b3-9ea4-4ed7-b4b1-05e8ce9d3537" containerName="console" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.135390 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.136710 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.139848 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.140383 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.140597 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tn65r" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.142952 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.149541 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp"] Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.199839 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba371ab1-5402-4921-b341-f54033be1fca-apiservice-cert\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 
20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.199892 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggz47\" (UniqueName: \"kubernetes.io/projected/ba371ab1-5402-4921-b341-f54033be1fca-kube-api-access-ggz47\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.200034 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba371ab1-5402-4921-b341-f54033be1fca-webhook-cert\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.300930 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba371ab1-5402-4921-b341-f54033be1fca-apiservice-cert\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.300997 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggz47\" (UniqueName: \"kubernetes.io/projected/ba371ab1-5402-4921-b341-f54033be1fca-kube-api-access-ggz47\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.301071 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ba371ab1-5402-4921-b341-f54033be1fca-webhook-cert\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.306451 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ba371ab1-5402-4921-b341-f54033be1fca-webhook-cert\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.306483 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ba371ab1-5402-4921-b341-f54033be1fca-apiservice-cert\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.322766 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggz47\" (UniqueName: \"kubernetes.io/projected/ba371ab1-5402-4921-b341-f54033be1fca-kube-api-access-ggz47\") pod \"metallb-operator-controller-manager-7fcdcd599c-wtkkp\" (UID: \"ba371ab1-5402-4921-b341-f54033be1fca\") " pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.452724 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.567592 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h"] Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.568287 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.574008 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.574156 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.574273 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-vjzwt" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.577379 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h"] Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.606747 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a336306-f1db-4c3d-ab2e-30c344b03195-apiservice-cert\") pod \"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.606791 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a336306-f1db-4c3d-ab2e-30c344b03195-webhook-cert\") pod 
\"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.606868 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgcxm\" (UniqueName: \"kubernetes.io/projected/5a336306-f1db-4c3d-ab2e-30c344b03195-kube-api-access-qgcxm\") pod \"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.703485 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp"] Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.707624 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a336306-f1db-4c3d-ab2e-30c344b03195-apiservice-cert\") pod \"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.707671 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a336306-f1db-4c3d-ab2e-30c344b03195-webhook-cert\") pod \"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.708142 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgcxm\" (UniqueName: \"kubernetes.io/projected/5a336306-f1db-4c3d-ab2e-30c344b03195-kube-api-access-qgcxm\") pod 
\"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.719674 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a336306-f1db-4c3d-ab2e-30c344b03195-webhook-cert\") pod \"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.725456 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a336306-f1db-4c3d-ab2e-30c344b03195-apiservice-cert\") pod \"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.726981 4849 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.734045 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgcxm\" (UniqueName: \"kubernetes.io/projected/5a336306-f1db-4c3d-ab2e-30c344b03195-kube-api-access-qgcxm\") pod \"metallb-operator-webhook-server-74dc9df6c8-4p87h\" (UID: \"5a336306-f1db-4c3d-ab2e-30c344b03195\") " pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:14 crc kubenswrapper[4849]: I0320 13:39:14.912219 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:15 crc kubenswrapper[4849]: I0320 13:39:15.309444 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h"] Mar 20 13:39:15 crc kubenswrapper[4849]: W0320 13:39:15.318990 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a336306_f1db_4c3d_ab2e_30c344b03195.slice/crio-f936192d900e49a27e03c1d5a78ce026b7ef07fd8fe15a1a688acaf479f8ff82 WatchSource:0}: Error finding container f936192d900e49a27e03c1d5a78ce026b7ef07fd8fe15a1a688acaf479f8ff82: Status 404 returned error can't find the container with id f936192d900e49a27e03c1d5a78ce026b7ef07fd8fe15a1a688acaf479f8ff82 Mar 20 13:39:15 crc kubenswrapper[4849]: I0320 13:39:15.669806 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" event={"ID":"5a336306-f1db-4c3d-ab2e-30c344b03195","Type":"ContainerStarted","Data":"f936192d900e49a27e03c1d5a78ce026b7ef07fd8fe15a1a688acaf479f8ff82"} Mar 20 13:39:15 crc kubenswrapper[4849]: I0320 13:39:15.671547 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" event={"ID":"ba371ab1-5402-4921-b341-f54033be1fca","Type":"ContainerStarted","Data":"de68f4722cb4a1c6d1bb317b3bda79000b74dd97937cf633c646afeb2f58296b"} Mar 20 13:39:20 crc kubenswrapper[4849]: I0320 13:39:20.709627 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" event={"ID":"5a336306-f1db-4c3d-ab2e-30c344b03195","Type":"ContainerStarted","Data":"be67909a7c60eaf7754da5e65e71462c3138c82970898c07cf5fa4791c23b6f2"} Mar 20 13:39:20 crc kubenswrapper[4849]: I0320 13:39:20.710185 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:20 crc kubenswrapper[4849]: I0320 13:39:20.711353 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" event={"ID":"ba371ab1-5402-4921-b341-f54033be1fca","Type":"ContainerStarted","Data":"fb3babd58708c11ed086b284c81434b7507536fd23b8f2be6a9d220a99c18649"} Mar 20 13:39:20 crc kubenswrapper[4849]: I0320 13:39:20.711499 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:20 crc kubenswrapper[4849]: I0320 13:39:20.734596 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" podStartSLOduration=2.274583707 podStartE2EDuration="6.734577446s" podCreationTimestamp="2026-03-20 13:39:14 +0000 UTC" firstStartedPulling="2026-03-20 13:39:15.321575344 +0000 UTC m=+904.999298739" lastFinishedPulling="2026-03-20 13:39:19.781569083 +0000 UTC m=+909.459292478" observedRunningTime="2026-03-20 13:39:20.72984258 +0000 UTC m=+910.407565985" watchObservedRunningTime="2026-03-20 13:39:20.734577446 +0000 UTC m=+910.412300841" Mar 20 13:39:20 crc kubenswrapper[4849]: I0320 13:39:20.752181 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" podStartSLOduration=1.7107303740000002 podStartE2EDuration="6.752165285s" podCreationTimestamp="2026-03-20 13:39:14 +0000 UTC" firstStartedPulling="2026-03-20 13:39:14.726672273 +0000 UTC m=+904.404395668" lastFinishedPulling="2026-03-20 13:39:19.768107184 +0000 UTC m=+909.445830579" observedRunningTime="2026-03-20 13:39:20.749999237 +0000 UTC m=+910.427722642" watchObservedRunningTime="2026-03-20 13:39:20.752165285 +0000 UTC m=+910.429888680" Mar 20 13:39:34 crc kubenswrapper[4849]: I0320 13:39:34.920941 4849 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74dc9df6c8-4p87h" Mar 20 13:39:54 crc kubenswrapper[4849]: I0320 13:39:54.455668 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7fcdcd599c-wtkkp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.116569 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9k4zx"] Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.120008 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.121881 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-npsfq" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.121879 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.122340 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.134538 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz"] Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.135237 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.137272 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.153453 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz"] Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.217024 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zmzbp"] Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.218261 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.220482 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.220794 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-mggv4" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.220906 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.221377 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.236059 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-rxmwb"] Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.237156 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.241886 4849 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.249813 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-metrics\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.249886 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61f44068-2b56-4919-83ac-1e6c43aaf840-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kbdhz\" (UID: \"61f44068-2b56-4919-83ac-1e6c43aaf840\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.249911 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-frr-sockets\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.249935 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-reloader\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.249969 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/82866517-1e14-49c5-81be-88da5e861369-metrics-certs\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.249994 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwnpc\" (UniqueName: \"kubernetes.io/projected/82866517-1e14-49c5-81be-88da5e861369-kube-api-access-wwnpc\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.250014 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82866517-1e14-49c5-81be-88da5e861369-frr-startup\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.250143 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g542s\" (UniqueName: \"kubernetes.io/projected/61f44068-2b56-4919-83ac-1e6c43aaf840-kube-api-access-g542s\") pod \"frr-k8s-webhook-server-bcc4b6f68-kbdhz\" (UID: \"61f44068-2b56-4919-83ac-1e6c43aaf840\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.250261 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-frr-conf\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.254049 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-rxmwb"] Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 
13:39:55.352059 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352108 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwnpc\" (UniqueName: \"kubernetes.io/projected/82866517-1e14-49c5-81be-88da5e861369-kube-api-access-wwnpc\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352132 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82866517-1e14-49c5-81be-88da5e861369-frr-startup\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352152 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwx5x\" (UniqueName: \"kubernetes.io/projected/9861c394-1567-4eeb-b487-979a4b725630-kube-api-access-gwx5x\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352168 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-metrics-certs\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352184 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jx9fq\" (UniqueName: \"kubernetes.io/projected/de1d165d-e05b-4ae2-b059-d8635faa0323-kube-api-access-jx9fq\") pod \"controller-7bb4cc7c98-rxmwb\" (UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352209 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9861c394-1567-4eeb-b487-979a4b725630-metallb-excludel2\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352245 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g542s\" (UniqueName: \"kubernetes.io/projected/61f44068-2b56-4919-83ac-1e6c43aaf840-kube-api-access-g542s\") pod \"frr-k8s-webhook-server-bcc4b6f68-kbdhz\" (UID: \"61f44068-2b56-4919-83ac-1e6c43aaf840\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352268 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-frr-conf\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352296 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/61f44068-2b56-4919-83ac-1e6c43aaf840-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kbdhz\" (UID: \"61f44068-2b56-4919-83ac-1e6c43aaf840\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352311 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-metrics\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352329 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-frr-sockets\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352351 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-reloader\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352367 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de1d165d-e05b-4ae2-b059-d8635faa0323-cert\") pod \"controller-7bb4cc7c98-rxmwb\" (UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352385 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de1d165d-e05b-4ae2-b059-d8635faa0323-metrics-certs\") pod \"controller-7bb4cc7c98-rxmwb\" (UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.352406 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82866517-1e14-49c5-81be-88da5e861369-metrics-certs\") pod \"frr-k8s-9k4zx\" (UID: 
\"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.355397 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-frr-sockets\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.355874 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-reloader\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.356022 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/82866517-1e14-49c5-81be-88da5e861369-frr-startup\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.356236 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-metrics\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.358261 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/82866517-1e14-49c5-81be-88da5e861369-frr-conf\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.361101 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/61f44068-2b56-4919-83ac-1e6c43aaf840-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-kbdhz\" (UID: \"61f44068-2b56-4919-83ac-1e6c43aaf840\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.365233 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/82866517-1e14-49c5-81be-88da5e861369-metrics-certs\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.372789 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwnpc\" (UniqueName: \"kubernetes.io/projected/82866517-1e14-49c5-81be-88da5e861369-kube-api-access-wwnpc\") pod \"frr-k8s-9k4zx\" (UID: \"82866517-1e14-49c5-81be-88da5e861369\") " pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.374348 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g542s\" (UniqueName: \"kubernetes.io/projected/61f44068-2b56-4919-83ac-1e6c43aaf840-kube-api-access-g542s\") pod \"frr-k8s-webhook-server-bcc4b6f68-kbdhz\" (UID: \"61f44068-2b56-4919-83ac-1e6c43aaf840\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.441199 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.452527 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.453864 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de1d165d-e05b-4ae2-b059-d8635faa0323-cert\") pod \"controller-7bb4cc7c98-rxmwb\" (UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.453931 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de1d165d-e05b-4ae2-b059-d8635faa0323-metrics-certs\") pod \"controller-7bb4cc7c98-rxmwb\" (UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.453992 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.454048 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwx5x\" (UniqueName: \"kubernetes.io/projected/9861c394-1567-4eeb-b487-979a4b725630-kube-api-access-gwx5x\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.454074 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-metrics-certs\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.454096 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx9fq\" (UniqueName: \"kubernetes.io/projected/de1d165d-e05b-4ae2-b059-d8635faa0323-kube-api-access-jx9fq\") pod \"controller-7bb4cc7c98-rxmwb\" (UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.454138 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9861c394-1567-4eeb-b487-979a4b725630-metallb-excludel2\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: E0320 13:39:55.455168 4849 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 13:39:55 crc kubenswrapper[4849]: E0320 13:39:55.455235 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist podName:9861c394-1567-4eeb-b487-979a4b725630 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:55.955217117 +0000 UTC m=+945.632940532 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist") pod "speaker-zmzbp" (UID: "9861c394-1567-4eeb-b487-979a4b725630") : secret "metallb-memberlist" not found Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.455864 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/9861c394-1567-4eeb-b487-979a4b725630-metallb-excludel2\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.457530 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-metrics-certs\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.457922 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de1d165d-e05b-4ae2-b059-d8635faa0323-cert\") pod \"controller-7bb4cc7c98-rxmwb\" (UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.458470 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de1d165d-e05b-4ae2-b059-d8635faa0323-metrics-certs\") pod \"controller-7bb4cc7c98-rxmwb\" (UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.471723 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx9fq\" (UniqueName: \"kubernetes.io/projected/de1d165d-e05b-4ae2-b059-d8635faa0323-kube-api-access-jx9fq\") pod \"controller-7bb4cc7c98-rxmwb\" 
(UID: \"de1d165d-e05b-4ae2-b059-d8635faa0323\") " pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.475293 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwx5x\" (UniqueName: \"kubernetes.io/projected/9861c394-1567-4eeb-b487-979a4b725630-kube-api-access-gwx5x\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.555219 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.811610 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-rxmwb"] Mar 20 13:39:55 crc kubenswrapper[4849]: W0320 13:39:55.817853 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde1d165d_e05b_4ae2_b059_d8635faa0323.slice/crio-04ae72e907e066667310f08afb2fcaeec2a6d7dae052e90a85d8e5e437a29ee9 WatchSource:0}: Error finding container 04ae72e907e066667310f08afb2fcaeec2a6d7dae052e90a85d8e5e437a29ee9: Status 404 returned error can't find the container with id 04ae72e907e066667310f08afb2fcaeec2a6d7dae052e90a85d8e5e437a29ee9 Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.844809 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz"] Mar 20 13:39:55 crc kubenswrapper[4849]: W0320 13:39:55.849260 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61f44068_2b56_4919_83ac_1e6c43aaf840.slice/crio-93466253fe6d68aae8ce9a8788c9ddd43c03b909feb92e5a4739f0a04786d7a7 WatchSource:0}: Error finding container 93466253fe6d68aae8ce9a8788c9ddd43c03b909feb92e5a4739f0a04786d7a7: Status 404 returned error can't find 
the container with id 93466253fe6d68aae8ce9a8788c9ddd43c03b909feb92e5a4739f0a04786d7a7 Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.933730 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerStarted","Data":"f87ec97d22ad6bb089bd333b970abe14834f8edf52ea80fb8c0b51ef018d0f02"} Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.935198 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-rxmwb" event={"ID":"de1d165d-e05b-4ae2-b059-d8635faa0323","Type":"ContainerStarted","Data":"78c9a8c01bc4cbbfb57cb18c415bef3282623d01050ec4721e0c62a7afe52796"} Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.935264 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-rxmwb" event={"ID":"de1d165d-e05b-4ae2-b059-d8635faa0323","Type":"ContainerStarted","Data":"04ae72e907e066667310f08afb2fcaeec2a6d7dae052e90a85d8e5e437a29ee9"} Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.935981 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" event={"ID":"61f44068-2b56-4919-83ac-1e6c43aaf840","Type":"ContainerStarted","Data":"93466253fe6d68aae8ce9a8788c9ddd43c03b909feb92e5a4739f0a04786d7a7"} Mar 20 13:39:55 crc kubenswrapper[4849]: I0320 13:39:55.960287 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:55 crc kubenswrapper[4849]: E0320 13:39:55.960440 4849 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 13:39:55 crc kubenswrapper[4849]: E0320 13:39:55.960494 4849 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist podName:9861c394-1567-4eeb-b487-979a4b725630 nodeName:}" failed. No retries permitted until 2026-03-20 13:39:56.960479061 +0000 UTC m=+946.638202456 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist") pod "speaker-zmzbp" (UID: "9861c394-1567-4eeb-b487-979a4b725630") : secret "metallb-memberlist" not found Mar 20 13:39:56 crc kubenswrapper[4849]: I0320 13:39:56.956406 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-rxmwb" event={"ID":"de1d165d-e05b-4ae2-b059-d8635faa0323","Type":"ContainerStarted","Data":"57ccfae64137269f202e5b378c6d8fbedd294a0fcc27fdaa01825aa6d2b076f0"} Mar 20 13:39:56 crc kubenswrapper[4849]: I0320 13:39:56.956693 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:39:56 crc kubenswrapper[4849]: I0320 13:39:56.972406 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:56 crc kubenswrapper[4849]: I0320 13:39:56.974007 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-rxmwb" podStartSLOduration=1.973987556 podStartE2EDuration="1.973987556s" podCreationTimestamp="2026-03-20 13:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:39:56.972162148 +0000 UTC m=+946.649885563" watchObservedRunningTime="2026-03-20 13:39:56.973987556 +0000 UTC m=+946.651710951" Mar 20 13:39:56 crc kubenswrapper[4849]: I0320 13:39:56.978268 
4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/9861c394-1567-4eeb-b487-979a4b725630-memberlist\") pod \"speaker-zmzbp\" (UID: \"9861c394-1567-4eeb-b487-979a4b725630\") " pod="metallb-system/speaker-zmzbp" Mar 20 13:39:57 crc kubenswrapper[4849]: I0320 13:39:57.041069 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zmzbp" Mar 20 13:39:57 crc kubenswrapper[4849]: W0320 13:39:57.067357 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9861c394_1567_4eeb_b487_979a4b725630.slice/crio-7494a182708ee13f8c900f4960edef51c8657c1179966f91a5ad0207f9108c22 WatchSource:0}: Error finding container 7494a182708ee13f8c900f4960edef51c8657c1179966f91a5ad0207f9108c22: Status 404 returned error can't find the container with id 7494a182708ee13f8c900f4960edef51c8657c1179966f91a5ad0207f9108c22 Mar 20 13:39:57 crc kubenswrapper[4849]: I0320 13:39:57.971237 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zmzbp" event={"ID":"9861c394-1567-4eeb-b487-979a4b725630","Type":"ContainerStarted","Data":"0e75a97a00addc50068af2d919cf6d8399ea73eaf5a3f079dde25a1dcb4ce714"} Mar 20 13:39:57 crc kubenswrapper[4849]: I0320 13:39:57.971478 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zmzbp" event={"ID":"9861c394-1567-4eeb-b487-979a4b725630","Type":"ContainerStarted","Data":"2e47c9e11d221b0cf60ca73092c19279ca934002eb5eac21679c35f9b89644b9"} Mar 20 13:39:57 crc kubenswrapper[4849]: I0320 13:39:57.971490 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zmzbp" event={"ID":"9861c394-1567-4eeb-b487-979a4b725630","Type":"ContainerStarted","Data":"7494a182708ee13f8c900f4960edef51c8657c1179966f91a5ad0207f9108c22"} Mar 20 13:39:57 crc kubenswrapper[4849]: I0320 13:39:57.971954 4849 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zmzbp" Mar 20 13:39:57 crc kubenswrapper[4849]: I0320 13:39:57.990915 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zmzbp" podStartSLOduration=2.990891743 podStartE2EDuration="2.990891743s" podCreationTimestamp="2026-03-20 13:39:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:39:57.98627091 +0000 UTC m=+947.663994315" watchObservedRunningTime="2026-03-20 13:39:57.990891743 +0000 UTC m=+947.668615148" Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.130357 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566900-bwrb8"] Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.131487 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-bwrb8" Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.134843 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.134858 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.135057 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.139628 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-bwrb8"] Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.339083 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmn9r\" (UniqueName: \"kubernetes.io/projected/14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0-kube-api-access-qmn9r\") 
pod \"auto-csr-approver-29566900-bwrb8\" (UID: \"14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0\") " pod="openshift-infra/auto-csr-approver-29566900-bwrb8" Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.440092 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmn9r\" (UniqueName: \"kubernetes.io/projected/14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0-kube-api-access-qmn9r\") pod \"auto-csr-approver-29566900-bwrb8\" (UID: \"14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0\") " pod="openshift-infra/auto-csr-approver-29566900-bwrb8" Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.457772 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmn9r\" (UniqueName: \"kubernetes.io/projected/14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0-kube-api-access-qmn9r\") pod \"auto-csr-approver-29566900-bwrb8\" (UID: \"14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0\") " pod="openshift-infra/auto-csr-approver-29566900-bwrb8" Mar 20 13:40:00 crc kubenswrapper[4849]: I0320 13:40:00.753411 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-bwrb8" Mar 20 13:40:03 crc kubenswrapper[4849]: I0320 13:40:03.293490 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-bwrb8"] Mar 20 13:40:03 crc kubenswrapper[4849]: W0320 13:40:03.300088 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14cb3d0f_a0ee_4565_a3a5_b2ffa27586d0.slice/crio-dc51cf5049351c94aab6bbfadf621ac5c8d56e84386eda463b4ca7ab3f85f8b0 WatchSource:0}: Error finding container dc51cf5049351c94aab6bbfadf621ac5c8d56e84386eda463b4ca7ab3f85f8b0: Status 404 returned error can't find the container with id dc51cf5049351c94aab6bbfadf621ac5c8d56e84386eda463b4ca7ab3f85f8b0 Mar 20 13:40:04 crc kubenswrapper[4849]: I0320 13:40:04.011344 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-bwrb8" event={"ID":"14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0","Type":"ContainerStarted","Data":"dc51cf5049351c94aab6bbfadf621ac5c8d56e84386eda463b4ca7ab3f85f8b0"} Mar 20 13:40:04 crc kubenswrapper[4849]: I0320 13:40:04.013230 4849 generic.go:334] "Generic (PLEG): container finished" podID="82866517-1e14-49c5-81be-88da5e861369" containerID="4a2383736018d793f2b51009e9afcf5cabe8e090584bb4b2215ac34df6cf478c" exitCode=0 Mar 20 13:40:04 crc kubenswrapper[4849]: I0320 13:40:04.013301 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerDied","Data":"4a2383736018d793f2b51009e9afcf5cabe8e090584bb4b2215ac34df6cf478c"} Mar 20 13:40:04 crc kubenswrapper[4849]: I0320 13:40:04.015898 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" 
event={"ID":"61f44068-2b56-4919-83ac-1e6c43aaf840","Type":"ContainerStarted","Data":"8ef95c2b33d7b0e0de35de8ad6f3f2c37f19282dcce9461454118009a33db548"} Mar 20 13:40:04 crc kubenswrapper[4849]: I0320 13:40:04.016069 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:40:04 crc kubenswrapper[4849]: I0320 13:40:04.059033 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" podStartSLOduration=1.9115702319999999 podStartE2EDuration="9.059011355s" podCreationTimestamp="2026-03-20 13:39:55 +0000 UTC" firstStartedPulling="2026-03-20 13:39:55.851734431 +0000 UTC m=+945.529457826" lastFinishedPulling="2026-03-20 13:40:02.999175434 +0000 UTC m=+952.676898949" observedRunningTime="2026-03-20 13:40:04.057556836 +0000 UTC m=+953.735280251" watchObservedRunningTime="2026-03-20 13:40:04.059011355 +0000 UTC m=+953.736734750" Mar 20 13:40:05 crc kubenswrapper[4849]: I0320 13:40:05.023984 4849 generic.go:334] "Generic (PLEG): container finished" podID="82866517-1e14-49c5-81be-88da5e861369" containerID="35c0af576f82d19b8b4b0f79b1c20fdd664bb058cb21d3d40f78c82ff22114ec" exitCode=0 Mar 20 13:40:05 crc kubenswrapper[4849]: I0320 13:40:05.024043 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerDied","Data":"35c0af576f82d19b8b4b0f79b1c20fdd664bb058cb21d3d40f78c82ff22114ec"} Mar 20 13:40:05 crc kubenswrapper[4849]: I0320 13:40:05.027066 4849 generic.go:334] "Generic (PLEG): container finished" podID="14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0" containerID="b04e87520cc6d45e6d96f9d5d18f2768dbc9367da2e4fa7e0850e700f0134cda" exitCode=0 Mar 20 13:40:05 crc kubenswrapper[4849]: I0320 13:40:05.027122 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-bwrb8" 
event={"ID":"14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0","Type":"ContainerDied","Data":"b04e87520cc6d45e6d96f9d5d18f2768dbc9367da2e4fa7e0850e700f0134cda"} Mar 20 13:40:05 crc kubenswrapper[4849]: I0320 13:40:05.558665 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-rxmwb" Mar 20 13:40:06 crc kubenswrapper[4849]: I0320 13:40:06.034700 4849 generic.go:334] "Generic (PLEG): container finished" podID="82866517-1e14-49c5-81be-88da5e861369" containerID="c4958d7f35ae6e55fa8bde971c40a45433ae578edeb58bef44a6ace5fd934a61" exitCode=0 Mar 20 13:40:06 crc kubenswrapper[4849]: I0320 13:40:06.034803 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerDied","Data":"c4958d7f35ae6e55fa8bde971c40a45433ae578edeb58bef44a6ace5fd934a61"} Mar 20 13:40:06 crc kubenswrapper[4849]: I0320 13:40:06.302382 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-bwrb8" Mar 20 13:40:06 crc kubenswrapper[4849]: I0320 13:40:06.434276 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmn9r\" (UniqueName: \"kubernetes.io/projected/14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0-kube-api-access-qmn9r\") pod \"14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0\" (UID: \"14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0\") " Mar 20 13:40:06 crc kubenswrapper[4849]: I0320 13:40:06.452001 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0-kube-api-access-qmn9r" (OuterVolumeSpecName: "kube-api-access-qmn9r") pod "14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0" (UID: "14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0"). InnerVolumeSpecName "kube-api-access-qmn9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:06 crc kubenswrapper[4849]: I0320 13:40:06.536579 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmn9r\" (UniqueName: \"kubernetes.io/projected/14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0-kube-api-access-qmn9r\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.046079 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-bwrb8" Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.050947 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zmzbp" Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.051003 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-bwrb8" event={"ID":"14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0","Type":"ContainerDied","Data":"dc51cf5049351c94aab6bbfadf621ac5c8d56e84386eda463b4ca7ab3f85f8b0"} Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.051026 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc51cf5049351c94aab6bbfadf621ac5c8d56e84386eda463b4ca7ab3f85f8b0" Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.052349 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerStarted","Data":"ebeb938bb273fdf6c2d7d083ca0710d7a3acf2f6a4ab17aa601ea673f8d285f8"} Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.052372 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerStarted","Data":"391db7a10ece87218956f9e10072b47a2f09c8452910f398c8d5b7fbf117faa6"} Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.052381 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerStarted","Data":"89a48f0d049a852838f8ed60108f4ebc049093dc29d62a584f20f5e730ad809f"} Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.052389 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerStarted","Data":"bcf4b6e61807df855c4c6dea314d0d690128d40a6c9870f212b559a2a4bb064a"} Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.052397 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerStarted","Data":"ddc651e076b37f97b58db23c915b151e769f3c64366756a3072553ee5bc9b5c3"} Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.052407 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9k4zx" event={"ID":"82866517-1e14-49c5-81be-88da5e861369","Type":"ContainerStarted","Data":"881704821dcc01aa8d9ed93f30c144df4916bc86fe8ee1dc0ca246f56023899b"} Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.052540 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.096987 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9k4zx" podStartSLOduration=4.685036609 podStartE2EDuration="12.096970325s" podCreationTimestamp="2026-03-20 13:39:55 +0000 UTC" firstStartedPulling="2026-03-20 13:39:55.616842907 +0000 UTC m=+945.294566302" lastFinishedPulling="2026-03-20 13:40:03.028776623 +0000 UTC m=+952.706500018" observedRunningTime="2026-03-20 13:40:07.090689948 +0000 UTC m=+956.768413363" watchObservedRunningTime="2026-03-20 13:40:07.096970325 +0000 UTC m=+956.774693720" Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.350281 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29566894-rdk6s"] Mar 20 13:40:07 crc kubenswrapper[4849]: I0320 13:40:07.354208 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-rdk6s"] Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.042946 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6980a35c-419b-4198-8d99-788b37127584" path="/var/lib/kubelet/pods/6980a35c-419b-4198-8d99-788b37127584/volumes" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.760579 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5k8c2"] Mar 20 13:40:09 crc kubenswrapper[4849]: E0320 13:40:09.761115 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0" containerName="oc" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.761128 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0" containerName="oc" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.761227 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0" containerName="oc" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.761644 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5k8c2" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.764314 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.764973 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.765184 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-b44m2" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.774479 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5k8c2"] Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.880474 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt5fz\" (UniqueName: \"kubernetes.io/projected/b8322833-a0f0-485a-855b-7a2c0090d73b-kube-api-access-xt5fz\") pod \"openstack-operator-index-5k8c2\" (UID: \"b8322833-a0f0-485a-855b-7a2c0090d73b\") " pod="openstack-operators/openstack-operator-index-5k8c2" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.981610 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5fz\" (UniqueName: \"kubernetes.io/projected/b8322833-a0f0-485a-855b-7a2c0090d73b-kube-api-access-xt5fz\") pod \"openstack-operator-index-5k8c2\" (UID: \"b8322833-a0f0-485a-855b-7a2c0090d73b\") " pod="openstack-operators/openstack-operator-index-5k8c2" Mar 20 13:40:09 crc kubenswrapper[4849]: I0320 13:40:09.999228 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt5fz\" (UniqueName: \"kubernetes.io/projected/b8322833-a0f0-485a-855b-7a2c0090d73b-kube-api-access-xt5fz\") pod \"openstack-operator-index-5k8c2\" (UID: 
\"b8322833-a0f0-485a-855b-7a2c0090d73b\") " pod="openstack-operators/openstack-operator-index-5k8c2" Mar 20 13:40:10 crc kubenswrapper[4849]: I0320 13:40:10.086896 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5k8c2" Mar 20 13:40:10 crc kubenswrapper[4849]: I0320 13:40:10.324276 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5k8c2"] Mar 20 13:40:10 crc kubenswrapper[4849]: W0320 13:40:10.333289 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8322833_a0f0_485a_855b_7a2c0090d73b.slice/crio-880833fdee2aebf402f43cdd3ba9636a7b06c879afd91d4414d38e65beb8bdc0 WatchSource:0}: Error finding container 880833fdee2aebf402f43cdd3ba9636a7b06c879afd91d4414d38e65beb8bdc0: Status 404 returned error can't find the container with id 880833fdee2aebf402f43cdd3ba9636a7b06c879afd91d4414d38e65beb8bdc0 Mar 20 13:40:10 crc kubenswrapper[4849]: I0320 13:40:10.442052 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:40:10 crc kubenswrapper[4849]: I0320 13:40:10.476810 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9k4zx" Mar 20 13:40:11 crc kubenswrapper[4849]: I0320 13:40:11.077213 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5k8c2" event={"ID":"b8322833-a0f0-485a-855b-7a2c0090d73b","Type":"ContainerStarted","Data":"880833fdee2aebf402f43cdd3ba9636a7b06c879afd91d4414d38e65beb8bdc0"} Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.102324 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5k8c2" 
event={"ID":"b8322833-a0f0-485a-855b-7a2c0090d73b","Type":"ContainerStarted","Data":"df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0"} Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.119434 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5k8c2" podStartSLOduration=1.8363408589999999 podStartE2EDuration="4.119404417s" podCreationTimestamp="2026-03-20 13:40:09 +0000 UTC" firstStartedPulling="2026-03-20 13:40:10.339746977 +0000 UTC m=+960.017470372" lastFinishedPulling="2026-03-20 13:40:12.622810535 +0000 UTC m=+962.300533930" observedRunningTime="2026-03-20 13:40:13.116443448 +0000 UTC m=+962.794166863" watchObservedRunningTime="2026-03-20 13:40:13.119404417 +0000 UTC m=+962.797127852" Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.143905 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5k8c2"] Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.750197 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ntf6j"] Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.751397 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.762509 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ntf6j"] Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.841605 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvl4b\" (UniqueName: \"kubernetes.io/projected/eb632334-854a-446f-8964-4bc5812a7638-kube-api-access-tvl4b\") pod \"openstack-operator-index-ntf6j\" (UID: \"eb632334-854a-446f-8964-4bc5812a7638\") " pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.943666 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvl4b\" (UniqueName: \"kubernetes.io/projected/eb632334-854a-446f-8964-4bc5812a7638-kube-api-access-tvl4b\") pod \"openstack-operator-index-ntf6j\" (UID: \"eb632334-854a-446f-8964-4bc5812a7638\") " pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:13 crc kubenswrapper[4849]: I0320 13:40:13.967961 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvl4b\" (UniqueName: \"kubernetes.io/projected/eb632334-854a-446f-8964-4bc5812a7638-kube-api-access-tvl4b\") pod \"openstack-operator-index-ntf6j\" (UID: \"eb632334-854a-446f-8964-4bc5812a7638\") " pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:14 crc kubenswrapper[4849]: I0320 13:40:14.114429 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:14 crc kubenswrapper[4849]: I0320 13:40:14.592717 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ntf6j"] Mar 20 13:40:14 crc kubenswrapper[4849]: W0320 13:40:14.599235 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb632334_854a_446f_8964_4bc5812a7638.slice/crio-eccfe12de081824f4f1e474a54a29ee0f3b966bb7f639fad70a2fc260884298e WatchSource:0}: Error finding container eccfe12de081824f4f1e474a54a29ee0f3b966bb7f639fad70a2fc260884298e: Status 404 returned error can't find the container with id eccfe12de081824f4f1e474a54a29ee0f3b966bb7f639fad70a2fc260884298e Mar 20 13:40:15 crc kubenswrapper[4849]: I0320 13:40:15.119908 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ntf6j" event={"ID":"eb632334-854a-446f-8964-4bc5812a7638","Type":"ContainerStarted","Data":"eccfe12de081824f4f1e474a54a29ee0f3b966bb7f639fad70a2fc260884298e"} Mar 20 13:40:15 crc kubenswrapper[4849]: I0320 13:40:15.120168 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5k8c2" podUID="b8322833-a0f0-485a-855b-7a2c0090d73b" containerName="registry-server" containerID="cri-o://df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0" gracePeriod=2 Mar 20 13:40:15 crc kubenswrapper[4849]: I0320 13:40:15.463414 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-kbdhz" Mar 20 13:40:15 crc kubenswrapper[4849]: I0320 13:40:15.986458 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5k8c2" Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.074936 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt5fz\" (UniqueName: \"kubernetes.io/projected/b8322833-a0f0-485a-855b-7a2c0090d73b-kube-api-access-xt5fz\") pod \"b8322833-a0f0-485a-855b-7a2c0090d73b\" (UID: \"b8322833-a0f0-485a-855b-7a2c0090d73b\") " Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.081808 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8322833-a0f0-485a-855b-7a2c0090d73b-kube-api-access-xt5fz" (OuterVolumeSpecName: "kube-api-access-xt5fz") pod "b8322833-a0f0-485a-855b-7a2c0090d73b" (UID: "b8322833-a0f0-485a-855b-7a2c0090d73b"). InnerVolumeSpecName "kube-api-access-xt5fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.129443 4849 generic.go:334] "Generic (PLEG): container finished" podID="b8322833-a0f0-485a-855b-7a2c0090d73b" containerID="df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0" exitCode=0 Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.129500 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5k8c2" Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.129581 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5k8c2" event={"ID":"b8322833-a0f0-485a-855b-7a2c0090d73b","Type":"ContainerDied","Data":"df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0"} Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.129626 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5k8c2" event={"ID":"b8322833-a0f0-485a-855b-7a2c0090d73b","Type":"ContainerDied","Data":"880833fdee2aebf402f43cdd3ba9636a7b06c879afd91d4414d38e65beb8bdc0"} Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.129649 4849 scope.go:117] "RemoveContainer" containerID="df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0" Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.131266 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ntf6j" event={"ID":"eb632334-854a-446f-8964-4bc5812a7638","Type":"ContainerStarted","Data":"7313e5930ee55ba22a099b33d3e9b1a292132a488ba2660cc47a4d2b85cfe081"} Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.153293 4849 scope.go:117] "RemoveContainer" containerID="df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0" Mar 20 13:40:16 crc kubenswrapper[4849]: E0320 13:40:16.153837 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0\": container with ID starting with df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0 not found: ID does not exist" containerID="df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0" Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.153883 4849 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0"} err="failed to get container status \"df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0\": rpc error: code = NotFound desc = could not find container \"df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0\": container with ID starting with df0cd9161c346bf2758f2d9a40bbf1f71209f56ec7ba3d5b7b94bb07d814f6a0 not found: ID does not exist" Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.167741 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ntf6j" podStartSLOduration=2.4508789269999998 podStartE2EDuration="3.167717081s" podCreationTimestamp="2026-03-20 13:40:13 +0000 UTC" firstStartedPulling="2026-03-20 13:40:14.603616664 +0000 UTC m=+964.281340099" lastFinishedPulling="2026-03-20 13:40:15.320454838 +0000 UTC m=+964.998178253" observedRunningTime="2026-03-20 13:40:16.149190697 +0000 UTC m=+965.826914132" watchObservedRunningTime="2026-03-20 13:40:16.167717081 +0000 UTC m=+965.845440486" Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.168286 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5k8c2"] Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.173816 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5k8c2"] Mar 20 13:40:16 crc kubenswrapper[4849]: I0320 13:40:16.177673 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt5fz\" (UniqueName: \"kubernetes.io/projected/b8322833-a0f0-485a-855b-7a2c0090d73b-kube-api-access-xt5fz\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:17 crc kubenswrapper[4849]: I0320 13:40:17.043105 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8322833-a0f0-485a-855b-7a2c0090d73b" 
path="/var/lib/kubelet/pods/b8322833-a0f0-485a-855b-7a2c0090d73b/volumes" Mar 20 13:40:24 crc kubenswrapper[4849]: I0320 13:40:24.114903 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:24 crc kubenswrapper[4849]: I0320 13:40:24.115369 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:24 crc kubenswrapper[4849]: I0320 13:40:24.136416 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:24 crc kubenswrapper[4849]: I0320 13:40:24.213813 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ntf6j" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.189986 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"] Mar 20 13:40:25 crc kubenswrapper[4849]: E0320 13:40:25.190542 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8322833-a0f0-485a-855b-7a2c0090d73b" containerName="registry-server" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.190562 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8322833-a0f0-485a-855b-7a2c0090d73b" containerName="registry-server" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.190744 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8322833-a0f0-485a-855b-7a2c0090d73b" containerName="registry-server" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.192082 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.196086 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-566fh" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.206087 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"] Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.320424 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-bundle\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.320516 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqmlq\" (UniqueName: \"kubernetes.io/projected/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-kube-api-access-bqmlq\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.320551 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-util\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 
13:40:25.421640 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-bundle\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.421738 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqmlq\" (UniqueName: \"kubernetes.io/projected/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-kube-api-access-bqmlq\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.421788 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-util\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.422210 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-bundle\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.422437 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-util\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.441929 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqmlq\" (UniqueName: \"kubernetes.io/projected/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-kube-api-access-bqmlq\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.444103 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9k4zx"
Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.508083 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:25 crc kubenswrapper[4849]: I0320 13:40:25.904260 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"]
Mar 20 13:40:25 crc kubenswrapper[4849]: W0320 13:40:25.922071 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3bf2e9e_5405_4c01_977d_0ad6960a13e9.slice/crio-2414f67b922cfd11b2af7c2ebb80c8f913a96ccf97c1fe11c158a9c0a02c7fdd WatchSource:0}: Error finding container 2414f67b922cfd11b2af7c2ebb80c8f913a96ccf97c1fe11c158a9c0a02c7fdd: Status 404 returned error can't find the container with id 2414f67b922cfd11b2af7c2ebb80c8f913a96ccf97c1fe11c158a9c0a02c7fdd
Mar 20 13:40:26 crc kubenswrapper[4849]: I0320 13:40:26.204309 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" event={"ID":"b3bf2e9e-5405-4c01-977d-0ad6960a13e9","Type":"ContainerStarted","Data":"7a877eda7d9751a1deaa91bef7099b345e0fd18a0c1006118a954438500ac30c"}
Mar 20 13:40:26 crc kubenswrapper[4849]: I0320 13:40:26.204372 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" event={"ID":"b3bf2e9e-5405-4c01-977d-0ad6960a13e9","Type":"ContainerStarted","Data":"2414f67b922cfd11b2af7c2ebb80c8f913a96ccf97c1fe11c158a9c0a02c7fdd"}
Mar 20 13:40:27 crc kubenswrapper[4849]: I0320 13:40:27.211580 4849 generic.go:334] "Generic (PLEG): container finished" podID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerID="7a877eda7d9751a1deaa91bef7099b345e0fd18a0c1006118a954438500ac30c" exitCode=0
Mar 20 13:40:27 crc kubenswrapper[4849]: I0320 13:40:27.211899 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" event={"ID":"b3bf2e9e-5405-4c01-977d-0ad6960a13e9","Type":"ContainerDied","Data":"7a877eda7d9751a1deaa91bef7099b345e0fd18a0c1006118a954438500ac30c"}
Mar 20 13:40:28 crc kubenswrapper[4849]: I0320 13:40:28.220918 4849 generic.go:334] "Generic (PLEG): container finished" podID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerID="9236c85ce7ba594827a2466919ce71e6cfd2796621b7a98df7a5e69accb46f3a" exitCode=0
Mar 20 13:40:28 crc kubenswrapper[4849]: I0320 13:40:28.221051 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" event={"ID":"b3bf2e9e-5405-4c01-977d-0ad6960a13e9","Type":"ContainerDied","Data":"9236c85ce7ba594827a2466919ce71e6cfd2796621b7a98df7a5e69accb46f3a"}
Mar 20 13:40:29 crc kubenswrapper[4849]: I0320 13:40:29.234000 4849 generic.go:334] "Generic (PLEG): container finished" podID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerID="331fa7fa990da667defc452b11ce03d5ad4ab7abb34d7b7fda2b74577ed0716f" exitCode=0
Mar 20 13:40:29 crc kubenswrapper[4849]: I0320 13:40:29.234075 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" event={"ID":"b3bf2e9e-5405-4c01-977d-0ad6960a13e9","Type":"ContainerDied","Data":"331fa7fa990da667defc452b11ce03d5ad4ab7abb34d7b7fda2b74577ed0716f"}
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.636894 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.705456 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqmlq\" (UniqueName: \"kubernetes.io/projected/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-kube-api-access-bqmlq\") pod \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") "
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.705594 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-bundle\") pod \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") "
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.705638 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-util\") pod \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\" (UID: \"b3bf2e9e-5405-4c01-977d-0ad6960a13e9\") "
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.707223 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-bundle" (OuterVolumeSpecName: "bundle") pod "b3bf2e9e-5405-4c01-977d-0ad6960a13e9" (UID: "b3bf2e9e-5405-4c01-977d-0ad6960a13e9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.711611 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-kube-api-access-bqmlq" (OuterVolumeSpecName: "kube-api-access-bqmlq") pod "b3bf2e9e-5405-4c01-977d-0ad6960a13e9" (UID: "b3bf2e9e-5405-4c01-977d-0ad6960a13e9"). InnerVolumeSpecName "kube-api-access-bqmlq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.739022 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-util" (OuterVolumeSpecName: "util") pod "b3bf2e9e-5405-4c01-977d-0ad6960a13e9" (UID: "b3bf2e9e-5405-4c01-977d-0ad6960a13e9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.808078 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqmlq\" (UniqueName: \"kubernetes.io/projected/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-kube-api-access-bqmlq\") on node \"crc\" DevicePath \"\""
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.808118 4849 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:40:30 crc kubenswrapper[4849]: I0320 13:40:30.808131 4849 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3bf2e9e-5405-4c01-977d-0ad6960a13e9-util\") on node \"crc\" DevicePath \"\""
Mar 20 13:40:31 crc kubenswrapper[4849]: I0320 13:40:31.274963 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4" event={"ID":"b3bf2e9e-5405-4c01-977d-0ad6960a13e9","Type":"ContainerDied","Data":"2414f67b922cfd11b2af7c2ebb80c8f913a96ccf97c1fe11c158a9c0a02c7fdd"}
Mar 20 13:40:31 crc kubenswrapper[4849]: I0320 13:40:31.275013 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2414f67b922cfd11b2af7c2ebb80c8f913a96ccf97c1fe11c158a9c0a02c7fdd"
Mar 20 13:40:31 crc kubenswrapper[4849]: I0320 13:40:31.275108 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4"
Mar 20 13:40:32 crc kubenswrapper[4849]: I0320 13:40:32.944995 4849 scope.go:117] "RemoveContainer" containerID="a358dfe2a6c00c79f8bc73dd703386dfb4767430cef26ca2fec949fb6093ad30"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.360472 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"]
Mar 20 13:40:37 crc kubenswrapper[4849]: E0320 13:40:37.361249 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerName="util"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.361264 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerName="util"
Mar 20 13:40:37 crc kubenswrapper[4849]: E0320 13:40:37.361279 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerName="pull"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.361286 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerName="pull"
Mar 20 13:40:37 crc kubenswrapper[4849]: E0320 13:40:37.361297 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerName="extract"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.361304 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerName="extract"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.361447 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3bf2e9e-5405-4c01-977d-0ad6960a13e9" containerName="extract"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.361987 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.363662 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-r8v78"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.433575 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"]
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.498996 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm6kt\" (UniqueName: \"kubernetes.io/projected/3a778a1c-47e6-4f37-8bad-08edb0324503-kube-api-access-xm6kt\") pod \"openstack-operator-controller-init-59b5998766-k4qxz\" (UID: \"3a778a1c-47e6-4f37-8bad-08edb0324503\") " pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.600198 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm6kt\" (UniqueName: \"kubernetes.io/projected/3a778a1c-47e6-4f37-8bad-08edb0324503-kube-api-access-xm6kt\") pod \"openstack-operator-controller-init-59b5998766-k4qxz\" (UID: \"3a778a1c-47e6-4f37-8bad-08edb0324503\") " pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.633990 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm6kt\" (UniqueName: \"kubernetes.io/projected/3a778a1c-47e6-4f37-8bad-08edb0324503-kube-api-access-xm6kt\") pod \"openstack-operator-controller-init-59b5998766-k4qxz\" (UID: \"3a778a1c-47e6-4f37-8bad-08edb0324503\") " pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"
Mar 20 13:40:37 crc kubenswrapper[4849]: I0320 13:40:37.678706 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"
Mar 20 13:40:38 crc kubenswrapper[4849]: I0320 13:40:38.092091 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"]
Mar 20 13:40:38 crc kubenswrapper[4849]: W0320 13:40:38.101084 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a778a1c_47e6_4f37_8bad_08edb0324503.slice/crio-792ac12ee68b3e9ec569d9446b2019accfcd0a34a833901a3b10e9b3014e3c09 WatchSource:0}: Error finding container 792ac12ee68b3e9ec569d9446b2019accfcd0a34a833901a3b10e9b3014e3c09: Status 404 returned error can't find the container with id 792ac12ee68b3e9ec569d9446b2019accfcd0a34a833901a3b10e9b3014e3c09
Mar 20 13:40:38 crc kubenswrapper[4849]: I0320 13:40:38.321556 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz" event={"ID":"3a778a1c-47e6-4f37-8bad-08edb0324503","Type":"ContainerStarted","Data":"792ac12ee68b3e9ec569d9446b2019accfcd0a34a833901a3b10e9b3014e3c09"}
Mar 20 13:40:39 crc kubenswrapper[4849]: I0320 13:40:39.385161 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:40:39 crc kubenswrapper[4849]: I0320 13:40:39.385237 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:40:42 crc kubenswrapper[4849]: I0320 13:40:42.350177 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz" event={"ID":"3a778a1c-47e6-4f37-8bad-08edb0324503","Type":"ContainerStarted","Data":"300a0878386ae1d5dfa177d175fc0f06d60a06ab65fe18a1162bd066513fc685"}
Mar 20 13:40:42 crc kubenswrapper[4849]: I0320 13:40:42.350502 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"
Mar 20 13:40:42 crc kubenswrapper[4849]: I0320 13:40:42.376391 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz" podStartSLOduration=1.9361790719999998 podStartE2EDuration="5.376372957s" podCreationTimestamp="2026-03-20 13:40:37 +0000 UTC" firstStartedPulling="2026-03-20 13:40:38.103426365 +0000 UTC m=+987.781149760" lastFinishedPulling="2026-03-20 13:40:41.54362025 +0000 UTC m=+991.221343645" observedRunningTime="2026-03-20 13:40:42.373645854 +0000 UTC m=+992.051369259" watchObservedRunningTime="2026-03-20 13:40:42.376372957 +0000 UTC m=+992.054096352"
Mar 20 13:40:47 crc kubenswrapper[4849]: I0320 13:40:47.682233 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-59b5998766-k4qxz"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.340126 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b2ccv"]
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.341680 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.361711 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2ccv"]
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.430139 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-catalog-content\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.430189 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-utilities\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.430214 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t24s7\" (UniqueName: \"kubernetes.io/projected/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-kube-api-access-t24s7\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.531432 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-catalog-content\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.531465 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-utilities\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.531489 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t24s7\" (UniqueName: \"kubernetes.io/projected/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-kube-api-access-t24s7\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.531870 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-catalog-content\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.532006 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-utilities\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.565965 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24s7\" (UniqueName: \"kubernetes.io/projected/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-kube-api-access-t24s7\") pod \"certified-operators-b2ccv\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:01 crc kubenswrapper[4849]: I0320 13:41:01.657598 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2ccv"
Mar 20 13:41:02 crc kubenswrapper[4849]: I0320 13:41:02.150391 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2ccv"]
Mar 20 13:41:02 crc kubenswrapper[4849]: I0320 13:41:02.472772 4849 generic.go:334] "Generic (PLEG): container finished" podID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerID="c9f1822ca66ce840bc0b9d99c9142dfa59008e876345aa2e04fbcc09a832630c" exitCode=0
Mar 20 13:41:02 crc kubenswrapper[4849]: I0320 13:41:02.472814 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2ccv" event={"ID":"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1","Type":"ContainerDied","Data":"c9f1822ca66ce840bc0b9d99c9142dfa59008e876345aa2e04fbcc09a832630c"}
Mar 20 13:41:02 crc kubenswrapper[4849]: I0320 13:41:02.472859 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2ccv" event={"ID":"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1","Type":"ContainerStarted","Data":"810f90eccf7567ae83648f9dff9c4434169acc34c14ec04a495a86618e67d4ab"}
Mar 20 13:41:03 crc kubenswrapper[4849]: I0320 13:41:03.479135 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2ccv" event={"ID":"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1","Type":"ContainerStarted","Data":"797d929560363fa5741c6e5a4d986ee56b4452dfbd37b4176a9cc4a3e59d6a3d"}
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.486671 4849 generic.go:334] "Generic (PLEG): container finished" podID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerID="797d929560363fa5741c6e5a4d986ee56b4452dfbd37b4176a9cc4a3e59d6a3d" exitCode=0
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.486749 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2ccv" event={"ID":"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1","Type":"ContainerDied","Data":"797d929560363fa5741c6e5a4d986ee56b4452dfbd37b4176a9cc4a3e59d6a3d"}
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.549803 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.550545 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.552710 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-k2gjg"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.565136 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.575496 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.576275 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.577959 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-mtz6h"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.585878 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.586700 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.590536 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-tnvkk"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.604321 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.634912 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.649134 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.649988 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.654191 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-wwczl"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.654632 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.655539 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.658089 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rhjq2"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.662526 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.687920 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.690750 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfg2b\" (UniqueName: \"kubernetes.io/projected/b186179d-3d3c-4cd1-806b-d7d8682ac88f-kube-api-access-jfg2b\") pod \"cinder-operator-controller-manager-8d58dc466-55knt\" (UID: \"b186179d-3d3c-4cd1-806b-d7d8682ac88f\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.690850 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr795\" (UniqueName: \"kubernetes.io/projected/485ab391-8811-4d32-a7ce-de1f2c0cd1e5-kube-api-access-cr795\") pod \"barbican-operator-controller-manager-59bc569d95-mdm4l\" (UID: \"485ab391-8811-4d32-a7ce-de1f2c0cd1e5\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.690890 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjnxs\" (UniqueName: \"kubernetes.io/projected/377c37d3-9285-44fb-bcd7-1dba905a3133-kube-api-access-zjnxs\") pod \"designate-operator-controller-manager-588d4d986b-spl5d\" (UID: \"377c37d3-9285-44fb-bcd7-1dba905a3133\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.713878 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.714798 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.717135 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-jqkj7"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.717366 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.718258 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.724913 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.725879 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.728923 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-ggwz5"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.729201 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ktcdg"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.730510 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.730755 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.736353 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.744411 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.756890 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.758033 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.761340 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2dw7z"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.784699 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.787315 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.790589 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-b5cnv"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793162 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjnxs\" (UniqueName: \"kubernetes.io/projected/377c37d3-9285-44fb-bcd7-1dba905a3133-kube-api-access-zjnxs\") pod \"designate-operator-controller-manager-588d4d986b-spl5d\" (UID: \"377c37d3-9285-44fb-bcd7-1dba905a3133\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793267 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtbbf\" (UniqueName: \"kubernetes.io/projected/89f24131-b326-437f-8d55-ccc77b120d8a-kube-api-access-qtbbf\") pod \"glance-operator-controller-manager-79df6bcc97-gw4br\" (UID: \"89f24131-b326-437f-8d55-ccc77b120d8a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793322 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf6gr\" (UniqueName: \"kubernetes.io/projected/956cd6f9-4828-4304-9b85-12025b56b9d5-kube-api-access-lf6gr\") pod \"heat-operator-controller-manager-67dd5f86f5-bzttm\" (UID: \"956cd6f9-4828-4304-9b85-12025b56b9d5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793369 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfg2b\" (UniqueName: \"kubernetes.io/projected/b186179d-3d3c-4cd1-806b-d7d8682ac88f-kube-api-access-jfg2b\") pod \"cinder-operator-controller-manager-8d58dc466-55knt\" (UID: \"b186179d-3d3c-4cd1-806b-d7d8682ac88f\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793405 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnsq\" (UniqueName: \"kubernetes.io/projected/5345f1a2-c4af-46ca-b53a-acbb0cbcec04-kube-api-access-tqnsq\") pod \"ironic-operator-controller-manager-6f787dddc9-8thmf\" (UID: \"5345f1a2-c4af-46ca-b53a-acbb0cbcec04\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793491 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793542 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqg5g\" (UniqueName: \"kubernetes.io/projected/bb530eb5-4963-4790-89f6-e21f33d2b254-kube-api-access-rqg5g\") pod \"horizon-operator-controller-manager-8464cc45fb-jvhjd\" (UID: \"bb530eb5-4963-4790-89f6-e21f33d2b254\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793590 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr795\" (UniqueName: \"kubernetes.io/projected/485ab391-8811-4d32-a7ce-de1f2c0cd1e5-kube-api-access-cr795\") pod \"barbican-operator-controller-manager-59bc569d95-mdm4l\" (UID: \"485ab391-8811-4d32-a7ce-de1f2c0cd1e5\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.793632 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89c5w\" (UniqueName: \"kubernetes.io/projected/03c23473-4b4c-4e24-92f6-69363a9cf363-kube-api-access-89c5w\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.801884 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.820343 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.824290 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zqj9k"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.831240 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfg2b\" (UniqueName: \"kubernetes.io/projected/b186179d-3d3c-4cd1-806b-d7d8682ac88f-kube-api-access-jfg2b\") pod \"cinder-operator-controller-manager-8d58dc466-55knt\" (UID: \"b186179d-3d3c-4cd1-806b-d7d8682ac88f\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.831465 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr795\" (UniqueName: \"kubernetes.io/projected/485ab391-8811-4d32-a7ce-de1f2c0cd1e5-kube-api-access-cr795\") pod \"barbican-operator-controller-manager-59bc569d95-mdm4l\" (UID: \"485ab391-8811-4d32-a7ce-de1f2c0cd1e5\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.845306 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl"]
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.850527 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjnxs\" (UniqueName: \"kubernetes.io/projected/377c37d3-9285-44fb-bcd7-1dba905a3133-kube-api-access-zjnxs\") pod \"designate-operator-controller-manager-588d4d986b-spl5d\" (UID: \"377c37d3-9285-44fb-bcd7-1dba905a3133\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d"
Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.865481 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.884997 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895097 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf6gr\" (UniqueName: \"kubernetes.io/projected/956cd6f9-4828-4304-9b85-12025b56b9d5-kube-api-access-lf6gr\") pod \"heat-operator-controller-manager-67dd5f86f5-bzttm\" (UID: \"956cd6f9-4828-4304-9b85-12025b56b9d5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895141 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqnsq\" (UniqueName: \"kubernetes.io/projected/5345f1a2-c4af-46ca-b53a-acbb0cbcec04-kube-api-access-tqnsq\") pod \"ironic-operator-controller-manager-6f787dddc9-8thmf\" (UID: \"5345f1a2-c4af-46ca-b53a-acbb0cbcec04\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895174 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d29k\" (UniqueName: \"kubernetes.io/projected/eb969620-248a-4a3d-9377-e61dd62a263a-kube-api-access-8d29k\") pod \"manila-operator-controller-manager-55f864c847-m9dm8\" (UID: \"eb969620-248a-4a3d-9377-e61dd62a263a\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895204 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: 
\"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895223 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqg5g\" (UniqueName: \"kubernetes.io/projected/bb530eb5-4963-4790-89f6-e21f33d2b254-kube-api-access-rqg5g\") pod \"horizon-operator-controller-manager-8464cc45fb-jvhjd\" (UID: \"bb530eb5-4963-4790-89f6-e21f33d2b254\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895250 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5sb\" (UniqueName: \"kubernetes.io/projected/94697f17-6007-4cd9-9eb6-04832d0e94c6-kube-api-access-7j5sb\") pod \"keystone-operator-controller-manager-768b96df4c-vvscl\" (UID: \"94697f17-6007-4cd9-9eb6-04832d0e94c6\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895273 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89c5w\" (UniqueName: \"kubernetes.io/projected/03c23473-4b4c-4e24-92f6-69363a9cf363-kube-api-access-89c5w\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895303 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldsvg\" (UniqueName: \"kubernetes.io/projected/3fdffb87-786f-4c4e-88fd-b1dd7bcf728d-kube-api-access-ldsvg\") pod \"mariadb-operator-controller-manager-67ccfc9778-drbhr\" (UID: \"3fdffb87-786f-4c4e-88fd-b1dd7bcf728d\") " 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895326 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtbbf\" (UniqueName: \"kubernetes.io/projected/89f24131-b326-437f-8d55-ccc77b120d8a-kube-api-access-qtbbf\") pod \"glance-operator-controller-manager-79df6bcc97-gw4br\" (UID: \"89f24131-b326-437f-8d55-ccc77b120d8a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br" Mar 20 13:41:04 crc kubenswrapper[4849]: E0320 13:41:04.895603 4849 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:04 crc kubenswrapper[4849]: E0320 13:41:04.895648 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert podName:03c23473-4b4c-4e24-92f6-69363a9cf363 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:05.395632701 +0000 UTC m=+1015.073356096 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert") pod "infra-operator-controller-manager-669fff9c7c-v8lc5" (UID: "03c23473-4b4c-4e24-92f6-69363a9cf363") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.895867 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.910183 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.911151 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.911222 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.917278 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.917549 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c4z4q" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.917577 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.922361 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.925016 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.926449 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqnsq\" (UniqueName: \"kubernetes.io/projected/5345f1a2-c4af-46ca-b53a-acbb0cbcec04-kube-api-access-tqnsq\") pod \"ironic-operator-controller-manager-6f787dddc9-8thmf\" (UID: \"5345f1a2-c4af-46ca-b53a-acbb0cbcec04\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.927866 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rfllh" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.928896 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89c5w\" (UniqueName: \"kubernetes.io/projected/03c23473-4b4c-4e24-92f6-69363a9cf363-kube-api-access-89c5w\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.932863 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf6gr\" (UniqueName: \"kubernetes.io/projected/956cd6f9-4828-4304-9b85-12025b56b9d5-kube-api-access-lf6gr\") pod \"heat-operator-controller-manager-67dd5f86f5-bzttm\" (UID: \"956cd6f9-4828-4304-9b85-12025b56b9d5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.934669 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtbbf\" (UniqueName: \"kubernetes.io/projected/89f24131-b326-437f-8d55-ccc77b120d8a-kube-api-access-qtbbf\") pod \"glance-operator-controller-manager-79df6bcc97-gw4br\" (UID: 
\"89f24131-b326-437f-8d55-ccc77b120d8a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.937021 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqg5g\" (UniqueName: \"kubernetes.io/projected/bb530eb5-4963-4790-89f6-e21f33d2b254-kube-api-access-rqg5g\") pod \"horizon-operator-controller-manager-8464cc45fb-jvhjd\" (UID: \"bb530eb5-4963-4790-89f6-e21f33d2b254\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.939449 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.951959 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.959381 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.962710 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-c58fh" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.970441 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.976869 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.982749 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-b78d5"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.983466 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.984406 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.984915 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-h5plf" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.995132 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh"] Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.996245 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.997847 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.998788 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldsvg\" (UniqueName: \"kubernetes.io/projected/3fdffb87-786f-4c4e-88fd-b1dd7bcf728d-kube-api-access-ldsvg\") pod \"mariadb-operator-controller-manager-67ccfc9778-drbhr\" (UID: \"3fdffb87-786f-4c4e-88fd-b1dd7bcf728d\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.998870 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn4rz\" (UniqueName: \"kubernetes.io/projected/fe62d445-815f-4606-8cca-aa13f732c509-kube-api-access-dn4rz\") pod \"neutron-operator-controller-manager-767865f676-m6v2j\" (UID: \"fe62d445-815f-4606-8cca-aa13f732c509\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.998936 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d29k\" (UniqueName: \"kubernetes.io/projected/eb969620-248a-4a3d-9377-e61dd62a263a-kube-api-access-8d29k\") pod \"manila-operator-controller-manager-55f864c847-m9dm8\" (UID: \"eb969620-248a-4a3d-9377-e61dd62a263a\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.998972 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9fp\" (UniqueName: 
\"kubernetes.io/projected/f95c179b-0dc5-4cea-98e7-7df754c3c0e2-kube-api-access-jd9fp\") pod \"nova-operator-controller-manager-5d488d59fb-z2xd8\" (UID: \"f95c179b-0dc5-4cea-98e7-7df754c3c0e2\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" Mar 20 13:41:04 crc kubenswrapper[4849]: I0320 13:41:04.999078 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5sb\" (UniqueName: \"kubernetes.io/projected/94697f17-6007-4cd9-9eb6-04832d0e94c6-kube-api-access-7j5sb\") pod \"keystone-operator-controller-manager-768b96df4c-vvscl\" (UID: \"94697f17-6007-4cd9-9eb6-04832d0e94c6\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.000348 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-9t7hz" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.000791 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-b78d5"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.012049 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.016887 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5sb\" (UniqueName: \"kubernetes.io/projected/94697f17-6007-4cd9-9eb6-04832d0e94c6-kube-api-access-7j5sb\") pod \"keystone-operator-controller-manager-768b96df4c-vvscl\" (UID: \"94697f17-6007-4cd9-9eb6-04832d0e94c6\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.025192 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d29k\" (UniqueName: 
\"kubernetes.io/projected/eb969620-248a-4a3d-9377-e61dd62a263a-kube-api-access-8d29k\") pod \"manila-operator-controller-manager-55f864c847-m9dm8\" (UID: \"eb969620-248a-4a3d-9377-e61dd62a263a\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.029385 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldsvg\" (UniqueName: \"kubernetes.io/projected/3fdffb87-786f-4c4e-88fd-b1dd7bcf728d-kube-api-access-ldsvg\") pod \"mariadb-operator-controller-manager-67ccfc9778-drbhr\" (UID: \"3fdffb87-786f-4c4e-88fd-b1dd7bcf728d\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.044533 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.055463 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d8d62"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.056265 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.058475 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-zfvdd" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.059045 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d8d62"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.065055 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.066020 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.069583 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zgs5v" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.072012 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.078587 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.095131 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.096470 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.100030 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.100098 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd9fp\" (UniqueName: \"kubernetes.io/projected/f95c179b-0dc5-4cea-98e7-7df754c3c0e2-kube-api-access-jd9fp\") pod \"nova-operator-controller-manager-5d488d59fb-z2xd8\" (UID: \"f95c179b-0dc5-4cea-98e7-7df754c3c0e2\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.100130 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkzhx\" (UniqueName: \"kubernetes.io/projected/75598a7a-c554-416a-833a-5e2f1a40966e-kube-api-access-xkzhx\") pod \"ovn-operator-controller-manager-884679f54-b78d5\" (UID: \"75598a7a-c554-416a-833a-5e2f1a40966e\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.100216 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z2mq\" (UniqueName: \"kubernetes.io/projected/d8b2fadb-ac5a-4883-8f52-059f659844fb-kube-api-access-9z2mq\") pod \"octavia-operator-controller-manager-5b9f45d989-m4gkc\" (UID: \"d8b2fadb-ac5a-4883-8f52-059f659844fb\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" Mar 20 13:41:05 crc 
kubenswrapper[4849]: I0320 13:41:05.100268 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc8qn\" (UniqueName: \"kubernetes.io/projected/4996a7a1-3666-436e-b366-7f32c73cee02-kube-api-access-rc8qn\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.100338 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn4rz\" (UniqueName: \"kubernetes.io/projected/fe62d445-815f-4606-8cca-aa13f732c509-kube-api-access-dn4rz\") pod \"neutron-operator-controller-manager-767865f676-m6v2j\" (UID: \"fe62d445-815f-4606-8cca-aa13f732c509\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.106868 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.121397 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rvsbm" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.130099 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.130972 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.131484 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd9fp\" (UniqueName: \"kubernetes.io/projected/f95c179b-0dc5-4cea-98e7-7df754c3c0e2-kube-api-access-jd9fp\") pod \"nova-operator-controller-manager-5d488d59fb-z2xd8\" (UID: \"f95c179b-0dc5-4cea-98e7-7df754c3c0e2\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.138760 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xfzdr" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.155299 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn4rz\" (UniqueName: \"kubernetes.io/projected/fe62d445-815f-4606-8cca-aa13f732c509-kube-api-access-dn4rz\") pod \"neutron-operator-controller-manager-767865f676-m6v2j\" (UID: \"fe62d445-815f-4606-8cca-aa13f732c509\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.173095 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.174057 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.185050 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.202139 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkzhx\" (UniqueName: \"kubernetes.io/projected/75598a7a-c554-416a-833a-5e2f1a40966e-kube-api-access-xkzhx\") pod \"ovn-operator-controller-manager-884679f54-b78d5\" (UID: \"75598a7a-c554-416a-833a-5e2f1a40966e\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.202186 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsch\" (UniqueName: \"kubernetes.io/projected/001e060b-cc07-4327-a02f-2a8a9c593aa3-kube-api-access-nqsch\") pod \"placement-operator-controller-manager-5784578c99-d8d62\" (UID: \"001e060b-cc07-4327-a02f-2a8a9c593aa3\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.202221 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z2mq\" (UniqueName: \"kubernetes.io/projected/d8b2fadb-ac5a-4883-8f52-059f659844fb-kube-api-access-9z2mq\") pod \"octavia-operator-controller-manager-5b9f45d989-m4gkc\" (UID: \"d8b2fadb-ac5a-4883-8f52-059f659844fb\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.202252 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc8qn\" (UniqueName: \"kubernetes.io/projected/4996a7a1-3666-436e-b366-7f32c73cee02-kube-api-access-rc8qn\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:05 
crc kubenswrapper[4849]: I0320 13:41:05.202289 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nszfl\" (UniqueName: \"kubernetes.io/projected/c84791cf-2ae6-4edd-b8b4-449995825ee7-kube-api-access-nszfl\") pod \"swift-operator-controller-manager-c674c5965-6s2qr\" (UID: \"c84791cf-2ae6-4edd-b8b4-449995825ee7\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.202305 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwgx6\" (UniqueName: \"kubernetes.io/projected/5930666f-c065-44ca-a66c-42d75ef8a0ef-kube-api-access-qwgx6\") pod \"test-operator-controller-manager-5c5cb9c4d7-kfh92\" (UID: \"5930666f-c065-44ca-a66c-42d75ef8a0ef\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.202341 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4dbm\" (UniqueName: \"kubernetes.io/projected/35796358-732f-4ec4-88e0-f121b509a14c-kube-api-access-b4dbm\") pod \"telemetry-operator-controller-manager-d6b694c5-frmv4\" (UID: \"35796358-732f-4ec4-88e0-f121b509a14c\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.202375 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.203269 4849 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.203306 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert podName:4996a7a1-3666-436e-b366-7f32c73cee02 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:05.703293795 +0000 UTC m=+1015.381017190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" (UID: "4996a7a1-3666-436e-b366-7f32c73cee02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.220277 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z2mq\" (UniqueName: \"kubernetes.io/projected/d8b2fadb-ac5a-4883-8f52-059f659844fb-kube-api-access-9z2mq\") pod \"octavia-operator-controller-manager-5b9f45d989-m4gkc\" (UID: \"d8b2fadb-ac5a-4883-8f52-059f659844fb\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.222932 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkzhx\" (UniqueName: \"kubernetes.io/projected/75598a7a-c554-416a-833a-5e2f1a40966e-kube-api-access-xkzhx\") pod \"ovn-operator-controller-manager-884679f54-b78d5\" (UID: \"75598a7a-c554-416a-833a-5e2f1a40966e\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.224277 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc8qn\" (UniqueName: \"kubernetes.io/projected/4996a7a1-3666-436e-b366-7f32c73cee02-kube-api-access-rc8qn\") pod 
\"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.256902 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.257785 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.261508 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-c8n8w" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.275941 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.283785 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.295078 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.307100 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwgx6\" (UniqueName: \"kubernetes.io/projected/5930666f-c065-44ca-a66c-42d75ef8a0ef-kube-api-access-qwgx6\") pod \"test-operator-controller-manager-5c5cb9c4d7-kfh92\" (UID: \"5930666f-c065-44ca-a66c-42d75ef8a0ef\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.307129 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nszfl\" (UniqueName: \"kubernetes.io/projected/c84791cf-2ae6-4edd-b8b4-449995825ee7-kube-api-access-nszfl\") pod \"swift-operator-controller-manager-c674c5965-6s2qr\" (UID: \"c84791cf-2ae6-4edd-b8b4-449995825ee7\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.307163 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4dbm\" (UniqueName: \"kubernetes.io/projected/35796358-732f-4ec4-88e0-f121b509a14c-kube-api-access-b4dbm\") pod \"telemetry-operator-controller-manager-d6b694c5-frmv4\" (UID: \"35796358-732f-4ec4-88e0-f121b509a14c\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.307218 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsch\" (UniqueName: \"kubernetes.io/projected/001e060b-cc07-4327-a02f-2a8a9c593aa3-kube-api-access-nqsch\") pod \"placement-operator-controller-manager-5784578c99-d8d62\" (UID: \"001e060b-cc07-4327-a02f-2a8a9c593aa3\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.308207 
4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.310343 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.326947 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.330509 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwgx6\" (UniqueName: \"kubernetes.io/projected/5930666f-c065-44ca-a66c-42d75ef8a0ef-kube-api-access-qwgx6\") pod \"test-operator-controller-manager-5c5cb9c4d7-kfh92\" (UID: \"5930666f-c065-44ca-a66c-42d75ef8a0ef\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.332523 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsch\" (UniqueName: \"kubernetes.io/projected/001e060b-cc07-4327-a02f-2a8a9c593aa3-kube-api-access-nqsch\") pod \"placement-operator-controller-manager-5784578c99-d8d62\" (UID: \"001e060b-cc07-4327-a02f-2a8a9c593aa3\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.337565 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.338659 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.340881 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mv4mx" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.341049 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.342224 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nszfl\" (UniqueName: \"kubernetes.io/projected/c84791cf-2ae6-4edd-b8b4-449995825ee7-kube-api-access-nszfl\") pod \"swift-operator-controller-manager-c674c5965-6s2qr\" (UID: \"c84791cf-2ae6-4edd-b8b4-449995825ee7\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.345198 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.353744 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.354494 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4dbm\" (UniqueName: \"kubernetes.io/projected/35796358-732f-4ec4-88e0-f121b509a14c-kube-api-access-b4dbm\") pod \"telemetry-operator-controller-manager-d6b694c5-frmv4\" (UID: \"35796358-732f-4ec4-88e0-f121b509a14c\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.383542 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.398407 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.407911 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xzmd\" (UniqueName: \"kubernetes.io/projected/0dfb9834-d621-4ae6-aedf-d9135d4e22cd-kube-api-access-2xzmd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-rlm4l\" (UID: \"0dfb9834-d621-4ae6-aedf-d9135d4e22cd\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.407959 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.407985 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.408027 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs\") pod 
\"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.408067 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh4zj\" (UniqueName: \"kubernetes.io/projected/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-kube-api-access-wh4zj\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.408198 4849 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.408240 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert podName:03c23473-4b4c-4e24-92f6-69363a9cf363 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:06.40822494 +0000 UTC m=+1016.085948335 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert") pod "infra-operator-controller-manager-669fff9c7c-v8lc5" (UID: "03c23473-4b4c-4e24-92f6-69363a9cf363") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.428103 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.433877 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.474141 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.508169 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.509020 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.509071 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh4zj\" (UniqueName: \"kubernetes.io/projected/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-kube-api-access-wh4zj\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.509126 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xzmd\" (UniqueName: \"kubernetes.io/projected/0dfb9834-d621-4ae6-aedf-d9135d4e22cd-kube-api-access-2xzmd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-rlm4l\" (UID: 
\"0dfb9834-d621-4ae6-aedf-d9135d4e22cd\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.509176 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.509287 4849 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.509334 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:06.009320366 +0000 UTC m=+1015.687043761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "metrics-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.509625 4849 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.509702 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:06.009679535 +0000 UTC m=+1015.687403030 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "webhook-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.532370 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh4zj\" (UniqueName: \"kubernetes.io/projected/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-kube-api-access-wh4zj\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.533234 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xzmd\" (UniqueName: \"kubernetes.io/projected/0dfb9834-d621-4ae6-aedf-d9135d4e22cd-kube-api-access-2xzmd\") pod \"watcher-operator-controller-manager-6c4d75f7f9-rlm4l\" (UID: \"0dfb9834-d621-4ae6-aedf-d9135d4e22cd\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.602774 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.707193 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.715300 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.715464 4849 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: E0320 13:41:05.715532 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert podName:4996a7a1-3666-436e-b366-7f32c73cee02 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:06.715511694 +0000 UTC m=+1016.393235089 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" (UID: "4996a7a1-3666-436e-b366-7f32c73cee02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.729571 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.740611 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.850831 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf"] Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.864911 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm"] Mar 20 13:41:05 crc kubenswrapper[4849]: W0320 13:41:05.915480 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod956cd6f9_4828_4304_9b85_12025b56b9d5.slice/crio-471d56a5aec494afe16f64ace8ca8061912a06ef7420daa2c7c998d60df6d01e WatchSource:0}: Error finding container 471d56a5aec494afe16f64ace8ca8061912a06ef7420daa2c7c998d60df6d01e: Status 404 returned error can't find the container with id 471d56a5aec494afe16f64ace8ca8061912a06ef7420daa2c7c998d60df6d01e Mar 20 13:41:05 crc kubenswrapper[4849]: I0320 13:41:05.980221 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr"] Mar 20 13:41:05 crc kubenswrapper[4849]: W0320 13:41:05.981888 4849 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fdffb87_786f_4c4e_88fd_b1dd7bcf728d.slice/crio-a3ac4bc4b9f6ea7eb65d7c00f211cf9b6cc76a7a4a2066070c027f9fb40dff0e WatchSource:0}: Error finding container a3ac4bc4b9f6ea7eb65d7c00f211cf9b6cc76a7a4a2066070c027f9fb40dff0e: Status 404 returned error can't find the container with id a3ac4bc4b9f6ea7eb65d7c00f211cf9b6cc76a7a4a2066070c027f9fb40dff0e Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:05.999992 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl"] Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.005036 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94697f17_6007_4cd9_9eb6_04832d0e94c6.slice/crio-03e813e19cd3738208c7422bc0eadd9641cd1843c2ca192ccc22fe8c4a64df46 WatchSource:0}: Error finding container 03e813e19cd3738208c7422bc0eadd9641cd1843c2ca192ccc22fe8c4a64df46: Status 404 returned error can't find the container with id 03e813e19cd3738208c7422bc0eadd9641cd1843c2ca192ccc22fe8c4a64df46 Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.024460 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.024598 4849 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.024695 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.024853 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:07.024835942 +0000 UTC m=+1016.702559337 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "metrics-server-cert" not found Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.024874 4849 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.025009 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:07.024999677 +0000 UTC m=+1016.702723072 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "webhook-server-cert" not found Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.094355 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-b78d5"] Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.102938 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75598a7a_c554_416a_833a_5e2f1a40966e.slice/crio-3e474a15367f17832755c685fac76914fbe92495dadafca6762d9183ab619647 WatchSource:0}: Error finding container 3e474a15367f17832755c685fac76914fbe92495dadafca6762d9183ab619647: Status 404 returned error can't find the container with id 3e474a15367f17832755c685fac76914fbe92495dadafca6762d9183ab619647 Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.215310 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8"] Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.226875 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8"] Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.234882 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb969620_248a_4a3d_9377_e61dd62a263a.slice/crio-9f40c9a0200e22a982ea1cf0e7a4be36d962f6911f757a1317918ab0dfa34e08 WatchSource:0}: Error finding container 9f40c9a0200e22a982ea1cf0e7a4be36d962f6911f757a1317918ab0dfa34e08: Status 404 returned error can't find the container with id 9f40c9a0200e22a982ea1cf0e7a4be36d962f6911f757a1317918ab0dfa34e08 Mar 20 13:41:06 crc kubenswrapper[4849]: 
I0320 13:41:06.236954 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc"] Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.236976 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95c179b_0dc5_4cea_98e7_7df754c3c0e2.slice/crio-b9c2481d8c4e7cbc7d7690c773d629bbafcbac4b436e2a31d6536ae21021ffbd WatchSource:0}: Error finding container b9c2481d8c4e7cbc7d7690c773d629bbafcbac4b436e2a31d6536ae21021ffbd: Status 404 returned error can't find the container with id b9c2481d8c4e7cbc7d7690c773d629bbafcbac4b436e2a31d6536ae21021ffbd Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.237465 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b2fadb_ac5a_4883_8f52_059f659844fb.slice/crio-193a68a8996eeec8a55311ef79a83d18959447aa18fec5d7619c6ed456ef854a WatchSource:0}: Error finding container 193a68a8996eeec8a55311ef79a83d18959447aa18fec5d7619c6ed456ef854a: Status 404 returned error can't find the container with id 193a68a8996eeec8a55311ef79a83d18959447aa18fec5d7619c6ed456ef854a Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.318009 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4"] Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.322686 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-d8d62"] Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.328302 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92"] Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.329840 4849 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5930666f_c065_44ca_a66c_42d75ef8a0ef.slice/crio-d13673e2a4b19954857fc9dfd5ce4186232b5a5689cbb6c694dff79679f50217 WatchSource:0}: Error finding container d13673e2a4b19954857fc9dfd5ce4186232b5a5689cbb6c694dff79679f50217: Status 404 returned error can't find the container with id d13673e2a4b19954857fc9dfd5ce4186232b5a5689cbb6c694dff79679f50217 Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.332349 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4dbm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-frmv4_openstack-operators(35796358-732f-4ec4-88e0-f121b509a14c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.333420 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" podUID="35796358-732f-4ec4-88e0-f121b509a14c" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.333507 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j"] Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.334129 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe62d445_815f_4606_8cca_aa13f732c509.slice/crio-8709265beda13262527d27ab1f917bb5483076a0aa38f1c6f30962ede6452202 WatchSource:0}: Error finding container 8709265beda13262527d27ab1f917bb5483076a0aa38f1c6f30962ede6452202: Status 404 returned error can't find the container with id 
8709265beda13262527d27ab1f917bb5483076a0aa38f1c6f30962ede6452202 Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.335216 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qwgx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-kfh92_openstack-operators(5930666f-c065-44ca-a66c-42d75ef8a0ef): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.336317 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" podUID="5930666f-c065-44ca-a66c-42d75ef8a0ef" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.336373 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dn4rz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-m6v2j_openstack-operators(fe62d445-815f-4606-8cca-aa13f732c509): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.337577 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" podUID="fe62d445-815f-4606-8cca-aa13f732c509" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.433631 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.433918 4849 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 
13:41:06.433991 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert podName:03c23473-4b4c-4e24-92f6-69363a9cf363 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:08.433959892 +0000 UTC m=+1018.111683287 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert") pod "infra-operator-controller-manager-669fff9c7c-v8lc5" (UID: "03c23473-4b4c-4e24-92f6-69363a9cf363") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.440151 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr"] Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.452644 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc84791cf_2ae6_4edd_b8b4_449995825ee7.slice/crio-012c0fbe91be2c41ca95bc4a331d5a9bd6a2940c54f97c3b2a854c5c404b7b81 WatchSource:0}: Error finding container 012c0fbe91be2c41ca95bc4a331d5a9bd6a2940c54f97c3b2a854c5c404b7b81: Status 404 returned error can't find the container with id 012c0fbe91be2c41ca95bc4a331d5a9bd6a2940c54f97c3b2a854c5c404b7b81 Mar 20 13:41:06 crc kubenswrapper[4849]: W0320 13:41:06.453525 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dfb9834_d621_4ae6_aedf_d9135d4e22cd.slice/crio-949943d97f423cb7f490e72b266cd7bbd5991291760db1b5127e3e0374728aae WatchSource:0}: Error finding container 949943d97f423cb7f490e72b266cd7bbd5991291760db1b5127e3e0374728aae: Status 404 returned error can't find the container with id 949943d97f423cb7f490e72b266cd7bbd5991291760db1b5127e3e0374728aae Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.454806 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l"] Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.456953 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nszfl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-6s2qr_openstack-operators(c84791cf-2ae6-4edd-b8b4-449995825ee7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.458052 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" podUID="c84791cf-2ae6-4edd-b8b4-449995825ee7" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.458168 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2xzmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-rlm4l_openstack-operators(0dfb9834-d621-4ae6-aedf-d9135d4e22cd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.459930 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" podUID="0dfb9834-d621-4ae6-aedf-d9135d4e22cd" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.533924 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt" event={"ID":"b186179d-3d3c-4cd1-806b-d7d8682ac88f","Type":"ContainerStarted","Data":"48567b7c5b9d51d92db99ee871795f5167d7801b4cc989c3c1631ee8b22f0266"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.536501 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" 
event={"ID":"f95c179b-0dc5-4cea-98e7-7df754c3c0e2","Type":"ContainerStarted","Data":"b9c2481d8c4e7cbc7d7690c773d629bbafcbac4b436e2a31d6536ae21021ffbd"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.537913 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br" event={"ID":"89f24131-b326-437f-8d55-ccc77b120d8a","Type":"ContainerStarted","Data":"88708ded191df16c59c8da9eb869e894b67d73bd9302c040c6a720630060b495"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.541796 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" event={"ID":"d8b2fadb-ac5a-4883-8f52-059f659844fb","Type":"ContainerStarted","Data":"193a68a8996eeec8a55311ef79a83d18959447aa18fec5d7619c6ed456ef854a"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.544151 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" event={"ID":"fe62d445-815f-4606-8cca-aa13f732c509","Type":"ContainerStarted","Data":"8709265beda13262527d27ab1f917bb5483076a0aa38f1c6f30962ede6452202"} Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.545374 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" podUID="fe62d445-815f-4606-8cca-aa13f732c509" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.545937 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" 
event={"ID":"3fdffb87-786f-4c4e-88fd-b1dd7bcf728d","Type":"ContainerStarted","Data":"a3ac4bc4b9f6ea7eb65d7c00f211cf9b6cc76a7a4a2066070c027f9fb40dff0e"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.547025 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" event={"ID":"001e060b-cc07-4327-a02f-2a8a9c593aa3","Type":"ContainerStarted","Data":"d6a961147b8545f4ae5d3204bbc653a028a22b2e94ece15f41c344a1a2d12b8f"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.548683 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" event={"ID":"94697f17-6007-4cd9-9eb6-04832d0e94c6","Type":"ContainerStarted","Data":"03e813e19cd3738208c7422bc0eadd9641cd1843c2ca192ccc22fe8c4a64df46"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.550243 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" event={"ID":"35796358-732f-4ec4-88e0-f121b509a14c","Type":"ContainerStarted","Data":"948c3bb377c5f8388077c17478503347475531053610e4a60ccc7b9431794cf3"} Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.551590 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" podUID="35796358-732f-4ec4-88e0-f121b509a14c" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.551830 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d" 
event={"ID":"377c37d3-9285-44fb-bcd7-1dba905a3133","Type":"ContainerStarted","Data":"fc304a181378a0d9081271ec24e686facd0fcf8f63cc9dec31cd27dd611ace94"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.553274 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" event={"ID":"0dfb9834-d621-4ae6-aedf-d9135d4e22cd","Type":"ContainerStarted","Data":"949943d97f423cb7f490e72b266cd7bbd5991291760db1b5127e3e0374728aae"} Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.554982 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" podUID="0dfb9834-d621-4ae6-aedf-d9135d4e22cd" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.555497 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" event={"ID":"75598a7a-c554-416a-833a-5e2f1a40966e","Type":"ContainerStarted","Data":"3e474a15367f17832755c685fac76914fbe92495dadafca6762d9183ab619647"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.558847 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l" event={"ID":"485ab391-8811-4d32-a7ce-de1f2c0cd1e5","Type":"ContainerStarted","Data":"ae1efedef58ce344f84a1c5095f40fbbd03ac8aba6398cc2b92e620f7620f9a0"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.562039 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" 
event={"ID":"eb969620-248a-4a3d-9377-e61dd62a263a","Type":"ContainerStarted","Data":"9f40c9a0200e22a982ea1cf0e7a4be36d962f6911f757a1317918ab0dfa34e08"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.568145 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" event={"ID":"5930666f-c065-44ca-a66c-42d75ef8a0ef","Type":"ContainerStarted","Data":"d13673e2a4b19954857fc9dfd5ce4186232b5a5689cbb6c694dff79679f50217"} Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.570529 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" podUID="5930666f-c065-44ca-a66c-42d75ef8a0ef" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.572871 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf" event={"ID":"5345f1a2-c4af-46ca-b53a-acbb0cbcec04","Type":"ContainerStarted","Data":"72ae3c7efdb860031cc72c7ab8de8a34b934a9cfe5ad7915bea3a8b3bb7abcfa"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.576578 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm" event={"ID":"956cd6f9-4828-4304-9b85-12025b56b9d5","Type":"ContainerStarted","Data":"471d56a5aec494afe16f64ace8ca8061912a06ef7420daa2c7c998d60df6d01e"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.577531 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd" 
event={"ID":"bb530eb5-4963-4790-89f6-e21f33d2b254","Type":"ContainerStarted","Data":"652c485b95426a6414892dabfff367273cdc4177771720409d0bdbcfe61dd8ea"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.578301 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" event={"ID":"c84791cf-2ae6-4edd-b8b4-449995825ee7","Type":"ContainerStarted","Data":"012c0fbe91be2c41ca95bc4a331d5a9bd6a2940c54f97c3b2a854c5c404b7b81"} Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.580329 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" podUID="c84791cf-2ae6-4edd-b8b4-449995825ee7" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.590261 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2ccv" event={"ID":"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1","Type":"ContainerStarted","Data":"2bc305f8061f2df8e5a1fc41cfc297c0ed702834345e5f6b22713ce757f73ea4"} Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.663455 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b2ccv" podStartSLOduration=3.077296511 podStartE2EDuration="5.663438192s" podCreationTimestamp="2026-03-20 13:41:01 +0000 UTC" firstStartedPulling="2026-03-20 13:41:02.47450822 +0000 UTC m=+1012.152231615" lastFinishedPulling="2026-03-20 13:41:05.060649901 +0000 UTC m=+1014.738373296" observedRunningTime="2026-03-20 13:41:06.653726793 +0000 UTC m=+1016.331450178" watchObservedRunningTime="2026-03-20 13:41:06.663438192 +0000 UTC m=+1016.341161587" Mar 20 13:41:06 crc kubenswrapper[4849]: I0320 13:41:06.738238 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.738456 4849 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:06 crc kubenswrapper[4849]: E0320 13:41:06.738542 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert podName:4996a7a1-3666-436e-b366-7f32c73cee02 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:08.738518594 +0000 UTC m=+1018.416242029 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" (UID: "4996a7a1-3666-436e-b366-7f32c73cee02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:07 crc kubenswrapper[4849]: I0320 13:41:07.041550 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.041714 4849 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.041766 4849 
secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.041780 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:09.04176403 +0000 UTC m=+1018.719487425 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "metrics-server-cert" not found Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.041842 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:09.041803371 +0000 UTC m=+1018.719526766 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "webhook-server-cert" not found Mar 20 13:41:07 crc kubenswrapper[4849]: I0320 13:41:07.042909 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.599205 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" podUID="5930666f-c065-44ca-a66c-42d75ef8a0ef" Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.599237 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" podUID="fe62d445-815f-4606-8cca-aa13f732c509" Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.599498 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" podUID="c84791cf-2ae6-4edd-b8b4-449995825ee7" Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.599549 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" podUID="35796358-732f-4ec4-88e0-f121b509a14c" Mar 20 13:41:07 crc kubenswrapper[4849]: E0320 13:41:07.599570 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" podUID="0dfb9834-d621-4ae6-aedf-d9135d4e22cd" Mar 20 13:41:08 crc kubenswrapper[4849]: I0320 13:41:08.465304 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:08 crc kubenswrapper[4849]: E0320 13:41:08.465504 4849 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:08 crc kubenswrapper[4849]: E0320 13:41:08.465687 4849 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert podName:03c23473-4b4c-4e24-92f6-69363a9cf363 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:12.465670539 +0000 UTC m=+1022.143393924 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert") pod "infra-operator-controller-manager-669fff9c7c-v8lc5" (UID: "03c23473-4b4c-4e24-92f6-69363a9cf363") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:08 crc kubenswrapper[4849]: I0320 13:41:08.768225 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:08 crc kubenswrapper[4849]: E0320 13:41:08.768355 4849 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:08 crc kubenswrapper[4849]: E0320 13:41:08.768403 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert podName:4996a7a1-3666-436e-b366-7f32c73cee02 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:12.768390381 +0000 UTC m=+1022.446113776 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" (UID: "4996a7a1-3666-436e-b366-7f32c73cee02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.072606 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.072664 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:09 crc kubenswrapper[4849]: E0320 13:41:09.072765 4849 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:41:09 crc kubenswrapper[4849]: E0320 13:41:09.072783 4849 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:41:09 crc kubenswrapper[4849]: E0320 13:41:09.072815 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:13.072801849 +0000 UTC m=+1022.750525244 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "webhook-server-cert" not found Mar 20 13:41:09 crc kubenswrapper[4849]: E0320 13:41:09.073052 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:13.073029495 +0000 UTC m=+1022.750752900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "metrics-server-cert" not found Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.283851 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hrjcc"] Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.286284 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.321789 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hrjcc"] Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.379688 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqh8p\" (UniqueName: \"kubernetes.io/projected/d9ea786d-a24c-4b9c-9949-8a763f3be064-kube-api-access-sqh8p\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.379774 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-catalog-content\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.379844 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-utilities\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.384030 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.384083 4849 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.480941 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqh8p\" (UniqueName: \"kubernetes.io/projected/d9ea786d-a24c-4b9c-9949-8a763f3be064-kube-api-access-sqh8p\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.481011 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-catalog-content\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.481047 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-utilities\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.481532 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-catalog-content\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.481560 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-utilities\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.504541 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqh8p\" (UniqueName: \"kubernetes.io/projected/d9ea786d-a24c-4b9c-9949-8a763f3be064-kube-api-access-sqh8p\") pod \"community-operators-hrjcc\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:09 crc kubenswrapper[4849]: I0320 13:41:09.611150 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:11 crc kubenswrapper[4849]: I0320 13:41:11.658360 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b2ccv" Mar 20 13:41:11 crc kubenswrapper[4849]: I0320 13:41:11.658744 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b2ccv" Mar 20 13:41:11 crc kubenswrapper[4849]: I0320 13:41:11.731085 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b2ccv" Mar 20 13:41:12 crc kubenswrapper[4849]: I0320 13:41:12.523154 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:12 crc kubenswrapper[4849]: E0320 13:41:12.523377 4849 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Mar 20 13:41:12 crc kubenswrapper[4849]: E0320 13:41:12.523469 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert podName:03c23473-4b4c-4e24-92f6-69363a9cf363 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:20.523448003 +0000 UTC m=+1030.201171468 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert") pod "infra-operator-controller-manager-669fff9c7c-v8lc5" (UID: "03c23473-4b4c-4e24-92f6-69363a9cf363") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:12 crc kubenswrapper[4849]: I0320 13:41:12.669679 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b2ccv" Mar 20 13:41:12 crc kubenswrapper[4849]: I0320 13:41:12.828851 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:12 crc kubenswrapper[4849]: E0320 13:41:12.829054 4849 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:12 crc kubenswrapper[4849]: E0320 13:41:12.829154 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert podName:4996a7a1-3666-436e-b366-7f32c73cee02 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:20.829132525 +0000 UTC m=+1030.506855920 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" (UID: "4996a7a1-3666-436e-b366-7f32c73cee02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:41:13 crc kubenswrapper[4849]: I0320 13:41:13.133020 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:13 crc kubenswrapper[4849]: I0320 13:41:13.133084 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:13 crc kubenswrapper[4849]: E0320 13:41:13.133255 4849 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:41:13 crc kubenswrapper[4849]: E0320 13:41:13.133309 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:21.133295986 +0000 UTC m=+1030.811019371 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "webhook-server-cert" not found Mar 20 13:41:13 crc kubenswrapper[4849]: E0320 13:41:13.133393 4849 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:41:13 crc kubenswrapper[4849]: E0320 13:41:13.133571 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs podName:7c9c4158-3ca1-4c9f-8fef-43a35bbff88b nodeName:}" failed. No retries permitted until 2026-03-20 13:41:21.133525942 +0000 UTC m=+1030.811249537 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-jhdbx" (UID: "7c9c4158-3ca1-4c9f-8fef-43a35bbff88b") : secret "metrics-server-cert" not found Mar 20 13:41:13 crc kubenswrapper[4849]: I0320 13:41:13.663897 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b2ccv"] Mar 20 13:41:14 crc kubenswrapper[4849]: I0320 13:41:14.642438 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b2ccv" podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerName="registry-server" containerID="cri-o://2bc305f8061f2df8e5a1fc41cfc297c0ed702834345e5f6b22713ce757f73ea4" gracePeriod=2 Mar 20 13:41:15 crc kubenswrapper[4849]: I0320 13:41:15.652552 4849 generic.go:334] "Generic (PLEG): container finished" podID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerID="2bc305f8061f2df8e5a1fc41cfc297c0ed702834345e5f6b22713ce757f73ea4" exitCode=0 Mar 20 13:41:15 crc kubenswrapper[4849]: I0320 
13:41:15.652750 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2ccv" event={"ID":"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1","Type":"ContainerDied","Data":"2bc305f8061f2df8e5a1fc41cfc297c0ed702834345e5f6b22713ce757f73ea4"} Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.076626 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h7h5h"] Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.078433 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.081882 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7h5h"] Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.190176 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-catalog-content\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.190328 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw2cj\" (UniqueName: \"kubernetes.io/projected/0615834a-954f-4364-9c31-438c573df61a-kube-api-access-kw2cj\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.190480 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-utilities\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " 
pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.291526 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-catalog-content\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.292242 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-catalog-content\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.292406 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw2cj\" (UniqueName: \"kubernetes.io/projected/0615834a-954f-4364-9c31-438c573df61a-kube-api-access-kw2cj\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.292450 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-utilities\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.292714 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-utilities\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " pod="openshift-marketplace/redhat-marketplace-h7h5h" 
Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.312473 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw2cj\" (UniqueName: \"kubernetes.io/projected/0615834a-954f-4364-9c31-438c573df61a-kube-api-access-kw2cj\") pod \"redhat-marketplace-h7h5h\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:16 crc kubenswrapper[4849]: I0320 13:41:16.398195 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:20 crc kubenswrapper[4849]: E0320 13:41:20.087372 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1" Mar 20 13:41:20 crc kubenswrapper[4849]: E0320 13:41:20.088091 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ldsvg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-drbhr_openstack-operators(3fdffb87-786f-4c4e-88fd-b1dd7bcf728d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:41:20 crc kubenswrapper[4849]: E0320 13:41:20.089297 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" podUID="3fdffb87-786f-4c4e-88fd-b1dd7bcf728d" Mar 20 13:41:20 crc kubenswrapper[4849]: I0320 13:41:20.562049 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:20 crc kubenswrapper[4849]: E0320 13:41:20.562428 4849 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:20 crc kubenswrapper[4849]: E0320 13:41:20.562571 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert podName:03c23473-4b4c-4e24-92f6-69363a9cf363 nodeName:}" failed. No retries permitted until 2026-03-20 13:41:36.562525703 +0000 UTC m=+1046.240249108 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert") pod "infra-operator-controller-manager-669fff9c7c-v8lc5" (UID: "03c23473-4b4c-4e24-92f6-69363a9cf363") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:41:20 crc kubenswrapper[4849]: E0320 13:41:20.686484 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" podUID="3fdffb87-786f-4c4e-88fd-b1dd7bcf728d" Mar 20 13:41:20 crc kubenswrapper[4849]: I0320 13:41:20.876092 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:20 crc kubenswrapper[4849]: I0320 13:41:20.882150 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4996a7a1-3666-436e-b366-7f32c73cee02-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5zg2mh\" (UID: \"4996a7a1-3666-436e-b366-7f32c73cee02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:20 crc kubenswrapper[4849]: I0320 13:41:20.945391 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-9t7hz" Mar 20 13:41:20 crc kubenswrapper[4849]: I0320 13:41:20.953931 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.179580 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.179950 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.184259 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.185655 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c9c4158-3ca1-4c9f-8fef-43a35bbff88b-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-jhdbx\" (UID: \"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.261748 4849 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2ccv" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.286505 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mv4mx" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.295440 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.382400 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t24s7\" (UniqueName: \"kubernetes.io/projected/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-kube-api-access-t24s7\") pod \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.382647 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-utilities\") pod \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.382777 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-catalog-content\") pod \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\" (UID: \"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1\") " Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.383908 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-utilities" (OuterVolumeSpecName: "utilities") pod "79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" (UID: "79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.386400 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-kube-api-access-t24s7" (OuterVolumeSpecName: "kube-api-access-t24s7") pod "79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" (UID: "79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1"). InnerVolumeSpecName "kube-api-access-t24s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.431566 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" (UID: "79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.484421 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.484452 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t24s7\" (UniqueName: \"kubernetes.io/projected/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-kube-api-access-t24s7\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.484463 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.695639 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2ccv" 
event={"ID":"79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1","Type":"ContainerDied","Data":"810f90eccf7567ae83648f9dff9c4434169acc34c14ec04a495a86618e67d4ab"} Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.695690 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2ccv" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.695702 4849 scope.go:117] "RemoveContainer" containerID="2bc305f8061f2df8e5a1fc41cfc297c0ed702834345e5f6b22713ce757f73ea4" Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.729667 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b2ccv"] Mar 20 13:41:21 crc kubenswrapper[4849]: I0320 13:41:21.735500 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b2ccv"] Mar 20 13:41:23 crc kubenswrapper[4849]: I0320 13:41:23.043328 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" path="/var/lib/kubelet/pods/79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1/volumes" Mar 20 13:41:25 crc kubenswrapper[4849]: E0320 13:41:25.686768 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 13:41:25 crc kubenswrapper[4849]: E0320 13:41:25.687254 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jd9fp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-z2xd8_openstack-operators(f95c179b-0dc5-4cea-98e7-7df754c3c0e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:41:25 crc kubenswrapper[4849]: E0320 13:41:25.688412 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" podUID="f95c179b-0dc5-4cea-98e7-7df754c3c0e2" Mar 20 13:41:25 crc kubenswrapper[4849]: E0320 13:41:25.722016 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" podUID="f95c179b-0dc5-4cea-98e7-7df754c3c0e2" Mar 20 13:41:25 crc kubenswrapper[4849]: E0320 13:41:25.935087 4849 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 13:41:25 crc kubenswrapper[4849]: E0320 13:41:25.935240 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7j5sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-vvscl_openstack-operators(94697f17-6007-4cd9-9eb6-04832d0e94c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:41:25 crc kubenswrapper[4849]: E0320 13:41:25.936412 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" podUID="94697f17-6007-4cd9-9eb6-04832d0e94c6" Mar 20 13:41:26 crc kubenswrapper[4849]: I0320 13:41:26.636122 4849 scope.go:117] "RemoveContainer" containerID="797d929560363fa5741c6e5a4d986ee56b4452dfbd37b4176a9cc4a3e59d6a3d" Mar 20 13:41:26 crc kubenswrapper[4849]: I0320 13:41:26.717057 4849 scope.go:117] "RemoveContainer" containerID="c9f1822ca66ce840bc0b9d99c9142dfa59008e876345aa2e04fbcc09a832630c" Mar 20 13:41:26 crc kubenswrapper[4849]: E0320 13:41:26.735047 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" podUID="94697f17-6007-4cd9-9eb6-04832d0e94c6" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.140568 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh"] Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.183764 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7h5h"] Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.202442 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hrjcc"] Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.265230 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx"] Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.739408 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" event={"ID":"75598a7a-c554-416a-833a-5e2f1a40966e","Type":"ContainerStarted","Data":"55089a60aae2bc9f2a857389ed7b656b39343cefd2db7536a1059709ccd8f14a"} Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.739479 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.740616 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l" event={"ID":"485ab391-8811-4d32-a7ce-de1f2c0cd1e5","Type":"ContainerStarted","Data":"ac8e8b45f933c1bf0bc334a653adb22e696a07af34deec13dd453074b97970a3"} Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.740780 4849 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.742764 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br" event={"ID":"89f24131-b326-437f-8d55-ccc77b120d8a","Type":"ContainerStarted","Data":"71a264a2c36f3e01beb5a3b69a5768e60ac425d1504d241e493c1fe309322c92"} Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.742865 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.745461 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" event={"ID":"001e060b-cc07-4327-a02f-2a8a9c593aa3","Type":"ContainerStarted","Data":"d34deea4b297fc70de7f3508c4258cf750a3f1d95006b4e827c20256153867af"} Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.745508 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.747983 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt" event={"ID":"b186179d-3d3c-4cd1-806b-d7d8682ac88f","Type":"ContainerStarted","Data":"bffadb6386977e796e5595628468dcf160a02bbb6a14fd9f5720261a87fd6be9"} Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.748598 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.759574 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" podStartSLOduration=3.943058943 podStartE2EDuration="23.759548458s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.105066872 +0000 UTC m=+1015.782790257" lastFinishedPulling="2026-03-20 13:41:25.921556377 +0000 UTC m=+1035.599279772" observedRunningTime="2026-03-20 13:41:27.752588492 +0000 UTC m=+1037.430311907" watchObservedRunningTime="2026-03-20 13:41:27.759548458 +0000 UTC m=+1037.437271853" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.771733 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br" podStartSLOduration=3.613020351 podStartE2EDuration="23.771713472s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:05.765928498 +0000 UTC m=+1015.443651893" lastFinishedPulling="2026-03-20 13:41:25.924621619 +0000 UTC m=+1035.602345014" observedRunningTime="2026-03-20 13:41:27.765191528 +0000 UTC m=+1037.442914923" watchObservedRunningTime="2026-03-20 13:41:27.771713472 +0000 UTC m=+1037.449436867" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.783768 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt" podStartSLOduration=3.64257628 podStartE2EDuration="23.783749053s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:05.780750644 +0000 UTC m=+1015.458474039" lastFinishedPulling="2026-03-20 13:41:25.921923417 +0000 UTC m=+1035.599646812" observedRunningTime="2026-03-20 13:41:27.779188701 +0000 UTC m=+1037.456912126" watchObservedRunningTime="2026-03-20 13:41:27.783749053 +0000 UTC m=+1037.461472448" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.804444 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l" podStartSLOduration=4.121300035 podStartE2EDuration="23.804424614s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:05.516639691 +0000 UTC m=+1015.194363086" lastFinishedPulling="2026-03-20 13:41:25.19976426 +0000 UTC m=+1034.877487665" observedRunningTime="2026-03-20 13:41:27.800312755 +0000 UTC m=+1037.478036180" watchObservedRunningTime="2026-03-20 13:41:27.804424614 +0000 UTC m=+1037.482148009" Mar 20 13:41:27 crc kubenswrapper[4849]: I0320 13:41:27.828471 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" podStartSLOduration=4.236836056 podStartE2EDuration="23.828452005s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.329493526 +0000 UTC m=+1016.007216921" lastFinishedPulling="2026-03-20 13:41:25.921109475 +0000 UTC m=+1035.598832870" observedRunningTime="2026-03-20 13:41:27.827272923 +0000 UTC m=+1037.504996328" watchObservedRunningTime="2026-03-20 13:41:27.828452005 +0000 UTC m=+1037.506175400" Mar 20 13:41:28 crc kubenswrapper[4849]: W0320 13:41:28.054399 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9c4158_3ca1_4c9f_8fef_43a35bbff88b.slice/crio-d26502ea80a6ca55811851a7e662c6d7702245c9df8673b470ad94044afbe7d3 WatchSource:0}: Error finding container d26502ea80a6ca55811851a7e662c6d7702245c9df8673b470ad94044afbe7d3: Status 404 returned error can't find the container with id d26502ea80a6ca55811851a7e662c6d7702245c9df8673b470ad94044afbe7d3 Mar 20 13:41:28 crc kubenswrapper[4849]: W0320 13:41:28.054797 4849 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4996a7a1_3666_436e_b366_7f32c73cee02.slice/crio-6a353fabef3693da2b9d94d8db69e9ca5472f788e3b578bb11722dc133d58189 WatchSource:0}: Error finding container 6a353fabef3693da2b9d94d8db69e9ca5472f788e3b578bb11722dc133d58189: Status 404 returned error can't find the container with id 6a353fabef3693da2b9d94d8db69e9ca5472f788e3b578bb11722dc133d58189 Mar 20 13:41:28 crc kubenswrapper[4849]: W0320 13:41:28.055623 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0615834a_954f_4364_9c31_438c573df61a.slice/crio-a45a95baba7582b0217bb96be8c377e659ee6f32d2f1b729bc2794f0d8c882ec WatchSource:0}: Error finding container a45a95baba7582b0217bb96be8c377e659ee6f32d2f1b729bc2794f0d8c882ec: Status 404 returned error can't find the container with id a45a95baba7582b0217bb96be8c377e659ee6f32d2f1b729bc2794f0d8c882ec Mar 20 13:41:28 crc kubenswrapper[4849]: W0320 13:41:28.057734 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9ea786d_a24c_4b9c_9949_8a763f3be064.slice/crio-5e01fbdd502c5841603aff5aa17c5bf9699f75057352d61dd9aed73a729dca65 WatchSource:0}: Error finding container 5e01fbdd502c5841603aff5aa17c5bf9699f75057352d61dd9aed73a729dca65: Status 404 returned error can't find the container with id 5e01fbdd502c5841603aff5aa17c5bf9699f75057352d61dd9aed73a729dca65 Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.762343 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf" event={"ID":"5345f1a2-c4af-46ca-b53a-acbb0cbcec04","Type":"ContainerStarted","Data":"a03c35fe45da0a7f4b6c30e117b8b9fd8337fff9cb82a583e9ebf1a975edcdb0"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.763129 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.772656 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm" event={"ID":"956cd6f9-4828-4304-9b85-12025b56b9d5","Type":"ContainerStarted","Data":"d2a7586b5b3a76c106aa322b1f0ac8d6a95aeb30ab7979f6570438fea6e9917e"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.772794 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.778065 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" event={"ID":"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b","Type":"ContainerStarted","Data":"d26502ea80a6ca55811851a7e662c6d7702245c9df8673b470ad94044afbe7d3"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.782600 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" event={"ID":"eb969620-248a-4a3d-9377-e61dd62a263a","Type":"ContainerStarted","Data":"035b9ac8a9b066dfd86f63c5ff87255fd60cefd2460ea7e6965193e83c3cf31d"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.783350 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.785129 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd" event={"ID":"bb530eb5-4963-4790-89f6-e21f33d2b254","Type":"ContainerStarted","Data":"b3539c4eec2ad0992486a11a25bb7c7cbd8f9ea5c461f11388d4b08be753ee1f"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.785459 4849 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.785473 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf" podStartSLOduration=4.779208689 podStartE2EDuration="24.785463165s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:05.915546378 +0000 UTC m=+1015.593269773" lastFinishedPulling="2026-03-20 13:41:25.921800854 +0000 UTC m=+1035.599524249" observedRunningTime="2026-03-20 13:41:28.785109675 +0000 UTC m=+1038.462833080" watchObservedRunningTime="2026-03-20 13:41:28.785463165 +0000 UTC m=+1038.463186560" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.786178 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7h5h" event={"ID":"0615834a-954f-4364-9c31-438c573df61a","Type":"ContainerStarted","Data":"a45a95baba7582b0217bb96be8c377e659ee6f32d2f1b729bc2794f0d8c882ec"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.793276 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d" event={"ID":"377c37d3-9285-44fb-bcd7-1dba905a3133","Type":"ContainerStarted","Data":"06c0b8eacb39fd04fbbc46c78d98f53257b6965d98dc4eaa51324675f396ac53"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.793700 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.804949 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" 
event={"ID":"d8b2fadb-ac5a-4883-8f52-059f659844fb","Type":"ContainerStarted","Data":"1d61ef92078d391c06528c31579758ae8630aeb4f2af310a99003fe237821937"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.805450 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.809444 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrjcc" event={"ID":"d9ea786d-a24c-4b9c-9949-8a763f3be064","Type":"ContainerStarted","Data":"5e01fbdd502c5841603aff5aa17c5bf9699f75057352d61dd9aed73a729dca65"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.811054 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" event={"ID":"4996a7a1-3666-436e-b366-7f32c73cee02","Type":"ContainerStarted","Data":"6a353fabef3693da2b9d94d8db69e9ca5472f788e3b578bb11722dc133d58189"} Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.811163 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm" podStartSLOduration=4.8153707 podStartE2EDuration="24.811149527s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:05.926023077 +0000 UTC m=+1015.603746472" lastFinishedPulling="2026-03-20 13:41:25.921801904 +0000 UTC m=+1035.599525299" observedRunningTime="2026-03-20 13:41:28.808482454 +0000 UTC m=+1038.486205859" watchObservedRunningTime="2026-03-20 13:41:28.811149527 +0000 UTC m=+1038.488872912" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.851420 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" podStartSLOduration=5.167188139 podStartE2EDuration="24.851406057s" 
podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.237341069 +0000 UTC m=+1015.915064464" lastFinishedPulling="2026-03-20 13:41:25.921558987 +0000 UTC m=+1035.599282382" observedRunningTime="2026-03-20 13:41:28.849564307 +0000 UTC m=+1038.527287692" watchObservedRunningTime="2026-03-20 13:41:28.851406057 +0000 UTC m=+1038.529129452" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.879271 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd" podStartSLOduration=4.716284294 podStartE2EDuration="24.879248748s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:05.761450209 +0000 UTC m=+1015.439173604" lastFinishedPulling="2026-03-20 13:41:25.924414663 +0000 UTC m=+1035.602138058" observedRunningTime="2026-03-20 13:41:28.870441877 +0000 UTC m=+1038.548165272" watchObservedRunningTime="2026-03-20 13:41:28.879248748 +0000 UTC m=+1038.556972143" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.929348 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d" podStartSLOduration=4.530793711 podStartE2EDuration="24.929330787s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:05.523032161 +0000 UTC m=+1015.200755556" lastFinishedPulling="2026-03-20 13:41:25.921569237 +0000 UTC m=+1035.599292632" observedRunningTime="2026-03-20 13:41:28.914994205 +0000 UTC m=+1038.592717600" watchObservedRunningTime="2026-03-20 13:41:28.929330787 +0000 UTC m=+1038.607054182" Mar 20 13:41:28 crc kubenswrapper[4849]: I0320 13:41:28.944769 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" podStartSLOduration=5.268625686 podStartE2EDuration="24.944752128s" 
podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.245493427 +0000 UTC m=+1015.923216822" lastFinishedPulling="2026-03-20 13:41:25.921619879 +0000 UTC m=+1035.599343264" observedRunningTime="2026-03-20 13:41:28.940583354 +0000 UTC m=+1038.618306749" watchObservedRunningTime="2026-03-20 13:41:28.944752128 +0000 UTC m=+1038.622475523" Mar 20 13:41:30 crc kubenswrapper[4849]: I0320 13:41:30.826679 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" event={"ID":"0dfb9834-d621-4ae6-aedf-d9135d4e22cd","Type":"ContainerStarted","Data":"c24e4da68fccc4359643d54605e12b9f6cf28182df7aab31a1cd5695bb902a9d"} Mar 20 13:41:30 crc kubenswrapper[4849]: I0320 13:41:30.827227 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" Mar 20 13:41:30 crc kubenswrapper[4849]: I0320 13:41:30.830883 4849 generic.go:334] "Generic (PLEG): container finished" podID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerID="13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d" exitCode=0 Mar 20 13:41:30 crc kubenswrapper[4849]: I0320 13:41:30.830952 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrjcc" event={"ID":"d9ea786d-a24c-4b9c-9949-8a763f3be064","Type":"ContainerDied","Data":"13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d"} Mar 20 13:41:30 crc kubenswrapper[4849]: I0320 13:41:30.844013 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" podStartSLOduration=5.584031278 podStartE2EDuration="25.843998518s" podCreationTimestamp="2026-03-20 13:41:05 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.458105336 +0000 UTC m=+1016.135828731" lastFinishedPulling="2026-03-20 13:41:26.718072576 +0000 UTC 
m=+1036.395795971" observedRunningTime="2026-03-20 13:41:30.841760147 +0000 UTC m=+1040.519483542" watchObservedRunningTime="2026-03-20 13:41:30.843998518 +0000 UTC m=+1040.521721913" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.839676 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" event={"ID":"fe62d445-815f-4606-8cca-aa13f732c509","Type":"ContainerStarted","Data":"f8beac8fd4a28f0dbdcf2a60e94820f28a409d403e693ed7ee665901a000ffc0"} Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.840059 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.841202 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" event={"ID":"c84791cf-2ae6-4edd-b8b4-449995825ee7","Type":"ContainerStarted","Data":"ad6e0c08e38252cdc85e6a41d07c0e62adfde49554c60d076a1004f05fa9411a"} Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.841400 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.842587 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" event={"ID":"5930666f-c065-44ca-a66c-42d75ef8a0ef","Type":"ContainerStarted","Data":"be5b51f78fee21702e1666aef012c7cd3e333d946214dea49a4457c8714cde41"} Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.842760 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.844761 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" event={"ID":"35796358-732f-4ec4-88e0-f121b509a14c","Type":"ContainerStarted","Data":"c102e4ba6f193a09aa29691a5c69700d14376baedb1f8608609118e769b3d760"} Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.845066 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.848467 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" event={"ID":"7c9c4158-3ca1-4c9f-8fef-43a35bbff88b","Type":"ContainerStarted","Data":"eb767820e37a58331d0c4a2dafc11d99462c3adb43df62c060c385769d261bab"} Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.849495 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.857039 4849 generic.go:334] "Generic (PLEG): container finished" podID="0615834a-954f-4364-9c31-438c573df61a" containerID="d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10" exitCode=0 Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.857467 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7h5h" event={"ID":"0615834a-954f-4364-9c31-438c573df61a","Type":"ContainerDied","Data":"d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10"} Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.871543 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" podStartSLOduration=3.675160166 podStartE2EDuration="27.871522928s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.336275307 +0000 UTC m=+1016.013998702" 
lastFinishedPulling="2026-03-20 13:41:30.532638049 +0000 UTC m=+1040.210361464" observedRunningTime="2026-03-20 13:41:31.855275324 +0000 UTC m=+1041.532998719" watchObservedRunningTime="2026-03-20 13:41:31.871522928 +0000 UTC m=+1041.549246323" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.872070 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" podStartSLOduration=3.798710633 podStartE2EDuration="27.872064193s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.456680048 +0000 UTC m=+1016.134403443" lastFinishedPulling="2026-03-20 13:41:30.530033608 +0000 UTC m=+1040.207757003" observedRunningTime="2026-03-20 13:41:31.86903249 +0000 UTC m=+1041.546755895" watchObservedRunningTime="2026-03-20 13:41:31.872064193 +0000 UTC m=+1041.549787588" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.885277 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" podStartSLOduration=2.672202896 podStartE2EDuration="26.885257243s" podCreationTimestamp="2026-03-20 13:41:05 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.335100246 +0000 UTC m=+1016.012823641" lastFinishedPulling="2026-03-20 13:41:30.548154593 +0000 UTC m=+1040.225877988" observedRunningTime="2026-03-20 13:41:31.883912706 +0000 UTC m=+1041.561636121" watchObservedRunningTime="2026-03-20 13:41:31.885257243 +0000 UTC m=+1041.562980638" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.919775 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" podStartSLOduration=26.919750096 podStartE2EDuration="26.919750096s" podCreationTimestamp="2026-03-20 13:41:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-20 13:41:31.915264533 +0000 UTC m=+1041.592987928" watchObservedRunningTime="2026-03-20 13:41:31.919750096 +0000 UTC m=+1041.597473501" Mar 20 13:41:31 crc kubenswrapper[4849]: I0320 13:41:31.932968 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" podStartSLOduration=3.7520446290000002 podStartE2EDuration="27.932951866s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.33224855 +0000 UTC m=+1016.009971945" lastFinishedPulling="2026-03-20 13:41:30.513155787 +0000 UTC m=+1040.190879182" observedRunningTime="2026-03-20 13:41:31.932583206 +0000 UTC m=+1041.610306611" watchObservedRunningTime="2026-03-20 13:41:31.932951866 +0000 UTC m=+1041.610675261" Mar 20 13:41:32 crc kubenswrapper[4849]: E0320 13:41:32.829214 4849 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9ea786d_a24c_4b9c_9949_8a763f3be064.slice/crio-conmon-2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9ea786d_a24c_4b9c_9949_8a763f3be064.slice/crio-2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:41:32 crc kubenswrapper[4849]: I0320 13:41:32.866006 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" event={"ID":"4996a7a1-3666-436e-b366-7f32c73cee02","Type":"ContainerStarted","Data":"d6fb0da10ed547e84ef10c5a2863f90187dcf5ee69a477091b379378b5e8706c"} Mar 20 13:41:32 crc kubenswrapper[4849]: I0320 13:41:32.866074 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:32 crc kubenswrapper[4849]: I0320 13:41:32.868986 4849 generic.go:334] "Generic (PLEG): container finished" podID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerID="2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd" exitCode=0 Mar 20 13:41:32 crc kubenswrapper[4849]: I0320 13:41:32.869065 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrjcc" event={"ID":"d9ea786d-a24c-4b9c-9949-8a763f3be064","Type":"ContainerDied","Data":"2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd"} Mar 20 13:41:32 crc kubenswrapper[4849]: I0320 13:41:32.899102 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" podStartSLOduration=24.656256143 podStartE2EDuration="28.899085938s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:28.074706681 +0000 UTC m=+1037.752430076" lastFinishedPulling="2026-03-20 13:41:32.317536486 +0000 UTC m=+1041.995259871" observedRunningTime="2026-03-20 13:41:32.887501202 +0000 UTC m=+1042.565224597" watchObservedRunningTime="2026-03-20 13:41:32.899085938 +0000 UTC m=+1042.576809333" Mar 20 13:41:33 crc kubenswrapper[4849]: I0320 13:41:33.877233 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrjcc" event={"ID":"d9ea786d-a24c-4b9c-9949-8a763f3be064","Type":"ContainerStarted","Data":"edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e"} Mar 20 13:41:33 crc kubenswrapper[4849]: I0320 13:41:33.879934 4849 generic.go:334] "Generic (PLEG): container finished" podID="0615834a-954f-4364-9c31-438c573df61a" containerID="9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c" exitCode=0 Mar 20 13:41:33 crc kubenswrapper[4849]: I0320 13:41:33.880680 4849 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7h5h" event={"ID":"0615834a-954f-4364-9c31-438c573df61a","Type":"ContainerDied","Data":"9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c"} Mar 20 13:41:33 crc kubenswrapper[4849]: I0320 13:41:33.902077 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hrjcc" podStartSLOduration=22.422701532 podStartE2EDuration="24.902060236s" podCreationTimestamp="2026-03-20 13:41:09 +0000 UTC" firstStartedPulling="2026-03-20 13:41:30.834259472 +0000 UTC m=+1040.511982867" lastFinishedPulling="2026-03-20 13:41:33.313618176 +0000 UTC m=+1042.991341571" observedRunningTime="2026-03-20 13:41:33.89488575 +0000 UTC m=+1043.572609165" watchObservedRunningTime="2026-03-20 13:41:33.902060236 +0000 UTC m=+1043.579783641" Mar 20 13:41:34 crc kubenswrapper[4849]: I0320 13:41:34.867957 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-mdm4l" Mar 20 13:41:34 crc kubenswrapper[4849]: I0320 13:41:34.888367 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7h5h" event={"ID":"0615834a-954f-4364-9c31-438c573df61a","Type":"ContainerStarted","Data":"d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5"} Mar 20 13:41:34 crc kubenswrapper[4849]: I0320 13:41:34.915699 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-spl5d" Mar 20 13:41:34 crc kubenswrapper[4849]: I0320 13:41:34.919683 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-55knt" Mar 20 13:41:34 crc kubenswrapper[4849]: I0320 13:41:34.923053 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-h7h5h" podStartSLOduration=16.800997948 podStartE2EDuration="18.923033026s" podCreationTimestamp="2026-03-20 13:41:16 +0000 UTC" firstStartedPulling="2026-03-20 13:41:32.14715069 +0000 UTC m=+1041.824874085" lastFinishedPulling="2026-03-20 13:41:34.269185778 +0000 UTC m=+1043.946909163" observedRunningTime="2026-03-20 13:41:34.921232537 +0000 UTC m=+1044.598955932" watchObservedRunningTime="2026-03-20 13:41:34.923033026 +0000 UTC m=+1044.600756421" Mar 20 13:41:34 crc kubenswrapper[4849]: I0320 13:41:34.975039 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-gw4br" Mar 20 13:41:34 crc kubenswrapper[4849]: I0320 13:41:34.988043 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-bzttm" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.050088 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-jvhjd" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.079174 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-8thmf" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.278105 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-m9dm8" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.286165 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-m6v2j" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.313613 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-m4gkc" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.332691 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-b78d5" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.386185 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-d8d62" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.401640 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6s2qr" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.430557 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-frmv4" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.477605 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-kfh92" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.606434 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rlm4l" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.900259 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" event={"ID":"3fdffb87-786f-4c4e-88fd-b1dd7bcf728d","Type":"ContainerStarted","Data":"b70ba1af04fb073381abb5c253869972dbba2a7895d90666540284145308a710"} Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.900803 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" Mar 20 13:41:35 crc kubenswrapper[4849]: I0320 13:41:35.925787 4849 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" podStartSLOduration=2.269754664 podStartE2EDuration="31.925751308s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:05.996695582 +0000 UTC m=+1015.674418977" lastFinishedPulling="2026-03-20 13:41:35.652692226 +0000 UTC m=+1045.330415621" observedRunningTime="2026-03-20 13:41:35.914789108 +0000 UTC m=+1045.592512493" watchObservedRunningTime="2026-03-20 13:41:35.925751308 +0000 UTC m=+1045.603474743" Mar 20 13:41:36 crc kubenswrapper[4849]: I0320 13:41:36.399775 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:36 crc kubenswrapper[4849]: I0320 13:41:36.399849 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:36 crc kubenswrapper[4849]: I0320 13:41:36.440040 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:36 crc kubenswrapper[4849]: I0320 13:41:36.626225 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:36 crc kubenswrapper[4849]: I0320 13:41:36.637753 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03c23473-4b4c-4e24-92f6-69363a9cf363-cert\") pod \"infra-operator-controller-manager-669fff9c7c-v8lc5\" (UID: \"03c23473-4b4c-4e24-92f6-69363a9cf363\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 
13:41:36 crc kubenswrapper[4849]: I0320 13:41:36.928208 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ktcdg" Mar 20 13:41:36 crc kubenswrapper[4849]: I0320 13:41:36.936667 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:37 crc kubenswrapper[4849]: I0320 13:41:37.352683 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5"] Mar 20 13:41:37 crc kubenswrapper[4849]: I0320 13:41:37.918677 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" event={"ID":"03c23473-4b4c-4e24-92f6-69363a9cf363","Type":"ContainerStarted","Data":"60e97472369322833ee77978165494f0fc5b72f5c425bc299a338968b6a0d8c5"} Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.384334 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.384755 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.384831 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.385566 4849 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"796d63258641a3af91f7958992403b9a5ad9b68fcc83db460f8e4cc151f123e0"} pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.385619 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" containerID="cri-o://796d63258641a3af91f7958992403b9a5ad9b68fcc83db460f8e4cc151f123e0" gracePeriod=600 Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.611748 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.612158 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.650998 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.936905 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" event={"ID":"03c23473-4b4c-4e24-92f6-69363a9cf363","Type":"ContainerStarted","Data":"b7683f589e135e0c3e96ea76eb93e52ce5d0515a7a6c08d136944101e153c787"} Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.937228 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.938569 4849 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" event={"ID":"f95c179b-0dc5-4cea-98e7-7df754c3c0e2","Type":"ContainerStarted","Data":"928eabac6940b6b7e0e000b61235b52fc873c4c35af90537b067f44abfc93caa"} Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.938877 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.941013 4849 generic.go:334] "Generic (PLEG): container finished" podID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerID="796d63258641a3af91f7958992403b9a5ad9b68fcc83db460f8e4cc151f123e0" exitCode=0 Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.941069 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerDied","Data":"796d63258641a3af91f7958992403b9a5ad9b68fcc83db460f8e4cc151f123e0"} Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.941097 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"320fbdc873fdc9693c329a47d54d9c46e735feb487e1c2d7c4da734e3de67821"} Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.941121 4849 scope.go:117] "RemoveContainer" containerID="b38365e3077d108f503fa5f04333e6db13420f95bbb3b1017d115a7dd6908444" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.957375 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" podStartSLOduration=33.968779537 podStartE2EDuration="35.957350229s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:37.357013169 +0000 UTC m=+1047.034736564" lastFinishedPulling="2026-03-20 
13:41:39.345583861 +0000 UTC m=+1049.023307256" observedRunningTime="2026-03-20 13:41:39.952070985 +0000 UTC m=+1049.629794380" watchObservedRunningTime="2026-03-20 13:41:39.957350229 +0000 UTC m=+1049.635073634" Mar 20 13:41:39 crc kubenswrapper[4849]: I0320 13:41:39.987687 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" podStartSLOduration=2.673799046 podStartE2EDuration="35.987667077s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.238253534 +0000 UTC m=+1015.915976929" lastFinishedPulling="2026-03-20 13:41:39.552121565 +0000 UTC m=+1049.229844960" observedRunningTime="2026-03-20 13:41:39.984604594 +0000 UTC m=+1049.662327989" watchObservedRunningTime="2026-03-20 13:41:39.987667077 +0000 UTC m=+1049.665390472" Mar 20 13:41:40 crc kubenswrapper[4849]: I0320 13:41:40.013808 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:40 crc kubenswrapper[4849]: I0320 13:41:40.073724 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hrjcc"] Mar 20 13:41:40 crc kubenswrapper[4849]: I0320 13:41:40.952062 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" event={"ID":"94697f17-6007-4cd9-9eb6-04832d0e94c6","Type":"ContainerStarted","Data":"dd2d49b7456a821a21701baa0dd6c204138ae695596715b03feb15b616a41207"} Mar 20 13:41:40 crc kubenswrapper[4849]: I0320 13:41:40.960520 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5zg2mh" Mar 20 13:41:40 crc kubenswrapper[4849]: I0320 13:41:40.974698 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" podStartSLOduration=2.475108538 podStartE2EDuration="36.974680039s" podCreationTimestamp="2026-03-20 13:41:04 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.007758437 +0000 UTC m=+1015.685481832" lastFinishedPulling="2026-03-20 13:41:40.507329938 +0000 UTC m=+1050.185053333" observedRunningTime="2026-03-20 13:41:40.968851639 +0000 UTC m=+1050.646575044" watchObservedRunningTime="2026-03-20 13:41:40.974680039 +0000 UTC m=+1050.652403434" Mar 20 13:41:41 crc kubenswrapper[4849]: I0320 13:41:41.301856 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-jhdbx" Mar 20 13:41:41 crc kubenswrapper[4849]: I0320 13:41:41.963601 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hrjcc" podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerName="registry-server" containerID="cri-o://edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e" gracePeriod=2 Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.338446 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.404812 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqh8p\" (UniqueName: \"kubernetes.io/projected/d9ea786d-a24c-4b9c-9949-8a763f3be064-kube-api-access-sqh8p\") pod \"d9ea786d-a24c-4b9c-9949-8a763f3be064\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.404961 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-catalog-content\") pod \"d9ea786d-a24c-4b9c-9949-8a763f3be064\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.405042 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-utilities\") pod \"d9ea786d-a24c-4b9c-9949-8a763f3be064\" (UID: \"d9ea786d-a24c-4b9c-9949-8a763f3be064\") " Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.405912 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-utilities" (OuterVolumeSpecName: "utilities") pod "d9ea786d-a24c-4b9c-9949-8a763f3be064" (UID: "d9ea786d-a24c-4b9c-9949-8a763f3be064"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.410035 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ea786d-a24c-4b9c-9949-8a763f3be064-kube-api-access-sqh8p" (OuterVolumeSpecName: "kube-api-access-sqh8p") pod "d9ea786d-a24c-4b9c-9949-8a763f3be064" (UID: "d9ea786d-a24c-4b9c-9949-8a763f3be064"). InnerVolumeSpecName "kube-api-access-sqh8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.458122 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9ea786d-a24c-4b9c-9949-8a763f3be064" (UID: "d9ea786d-a24c-4b9c-9949-8a763f3be064"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.505991 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.506026 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9ea786d-a24c-4b9c-9949-8a763f3be064-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.506037 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqh8p\" (UniqueName: \"kubernetes.io/projected/d9ea786d-a24c-4b9c-9949-8a763f3be064-kube-api-access-sqh8p\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.975158 4849 generic.go:334] "Generic (PLEG): container finished" podID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerID="edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e" exitCode=0 Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.975230 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrjcc" event={"ID":"d9ea786d-a24c-4b9c-9949-8a763f3be064","Type":"ContainerDied","Data":"edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e"} Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.975279 4849 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-hrjcc" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.975634 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrjcc" event={"ID":"d9ea786d-a24c-4b9c-9949-8a763f3be064","Type":"ContainerDied","Data":"5e01fbdd502c5841603aff5aa17c5bf9699f75057352d61dd9aed73a729dca65"} Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.975687 4849 scope.go:117] "RemoveContainer" containerID="edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e" Mar 20 13:41:42 crc kubenswrapper[4849]: I0320 13:41:42.998685 4849 scope.go:117] "RemoveContainer" containerID="2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd" Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.012888 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hrjcc"] Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.019324 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hrjcc"] Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.042301 4849 scope.go:117] "RemoveContainer" containerID="13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d" Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.046260 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" path="/var/lib/kubelet/pods/d9ea786d-a24c-4b9c-9949-8a763f3be064/volumes" Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.055663 4849 scope.go:117] "RemoveContainer" containerID="edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e" Mar 20 13:41:43 crc kubenswrapper[4849]: E0320 13:41:43.056128 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e\": container with ID 
starting with edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e not found: ID does not exist" containerID="edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e" Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.056208 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e"} err="failed to get container status \"edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e\": rpc error: code = NotFound desc = could not find container \"edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e\": container with ID starting with edd2f84e8f8e19ed76d220360ce7ce3845b50be55d52fa92315db6a600913a3e not found: ID does not exist" Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.056257 4849 scope.go:117] "RemoveContainer" containerID="2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd" Mar 20 13:41:43 crc kubenswrapper[4849]: E0320 13:41:43.056578 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd\": container with ID starting with 2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd not found: ID does not exist" containerID="2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd" Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.056623 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd"} err="failed to get container status \"2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd\": rpc error: code = NotFound desc = could not find container \"2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd\": container with ID starting with 2afe02568241ce23137be84b829c1d62f6032fb446c868218b8379a3071652fd not found: 
ID does not exist" Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.056655 4849 scope.go:117] "RemoveContainer" containerID="13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d" Mar 20 13:41:43 crc kubenswrapper[4849]: E0320 13:41:43.056883 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d\": container with ID starting with 13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d not found: ID does not exist" containerID="13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d" Mar 20 13:41:43 crc kubenswrapper[4849]: I0320 13:41:43.056911 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d"} err="failed to get container status \"13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d\": rpc error: code = NotFound desc = could not find container \"13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d\": container with ID starting with 13cd9346c76ff0e35643b5e45f81b9208f19768d8bbccbc49a14b7357aa4603d not found: ID does not exist" Mar 20 13:41:45 crc kubenswrapper[4849]: I0320 13:41:45.174235 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" Mar 20 13:41:45 crc kubenswrapper[4849]: I0320 13:41:45.176582 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-vvscl" Mar 20 13:41:45 crc kubenswrapper[4849]: I0320 13:41:45.187783 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-drbhr" Mar 20 13:41:45 crc kubenswrapper[4849]: I0320 13:41:45.307156 4849 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-z2xd8" Mar 20 13:41:46 crc kubenswrapper[4849]: I0320 13:41:46.453216 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:46 crc kubenswrapper[4849]: I0320 13:41:46.676131 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7h5h"] Mar 20 13:41:46 crc kubenswrapper[4849]: I0320 13:41:46.945623 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-v8lc5" Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.002118 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h7h5h" podUID="0615834a-954f-4364-9c31-438c573df61a" containerName="registry-server" containerID="cri-o://d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5" gracePeriod=2 Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.410675 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.585878 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-catalog-content\") pod \"0615834a-954f-4364-9c31-438c573df61a\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.585939 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-utilities\") pod \"0615834a-954f-4364-9c31-438c573df61a\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.586140 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw2cj\" (UniqueName: \"kubernetes.io/projected/0615834a-954f-4364-9c31-438c573df61a-kube-api-access-kw2cj\") pod \"0615834a-954f-4364-9c31-438c573df61a\" (UID: \"0615834a-954f-4364-9c31-438c573df61a\") " Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.587196 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-utilities" (OuterVolumeSpecName: "utilities") pod "0615834a-954f-4364-9c31-438c573df61a" (UID: "0615834a-954f-4364-9c31-438c573df61a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.594876 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0615834a-954f-4364-9c31-438c573df61a-kube-api-access-kw2cj" (OuterVolumeSpecName: "kube-api-access-kw2cj") pod "0615834a-954f-4364-9c31-438c573df61a" (UID: "0615834a-954f-4364-9c31-438c573df61a"). InnerVolumeSpecName "kube-api-access-kw2cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.614224 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0615834a-954f-4364-9c31-438c573df61a" (UID: "0615834a-954f-4364-9c31-438c573df61a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.688437 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.688478 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0615834a-954f-4364-9c31-438c573df61a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:47 crc kubenswrapper[4849]: I0320 13:41:47.688494 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw2cj\" (UniqueName: \"kubernetes.io/projected/0615834a-954f-4364-9c31-438c573df61a-kube-api-access-kw2cj\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.010075 4849 generic.go:334] "Generic (PLEG): container finished" podID="0615834a-954f-4364-9c31-438c573df61a" containerID="d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5" exitCode=0 Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.010122 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7h5h" event={"ID":"0615834a-954f-4364-9c31-438c573df61a","Type":"ContainerDied","Data":"d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5"} Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.010138 4849 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7h5h" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.010161 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7h5h" event={"ID":"0615834a-954f-4364-9c31-438c573df61a","Type":"ContainerDied","Data":"a45a95baba7582b0217bb96be8c377e659ee6f32d2f1b729bc2794f0d8c882ec"} Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.010198 4849 scope.go:117] "RemoveContainer" containerID="d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.032193 4849 scope.go:117] "RemoveContainer" containerID="9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.038987 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7h5h"] Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.047032 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7h5h"] Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.206442 4849 scope.go:117] "RemoveContainer" containerID="d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.542691 4849 scope.go:117] "RemoveContainer" containerID="d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5" Mar 20 13:41:48 crc kubenswrapper[4849]: E0320 13:41:48.543169 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5\": container with ID starting with d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5 not found: ID does not exist" containerID="d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.543312 4849 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5"} err="failed to get container status \"d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5\": rpc error: code = NotFound desc = could not find container \"d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5\": container with ID starting with d256b7267c644bd5898809caa2fa3eac894f6d39b92d5427f2c06c64de67a1f5 not found: ID does not exist" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.543366 4849 scope.go:117] "RemoveContainer" containerID="9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c" Mar 20 13:41:48 crc kubenswrapper[4849]: E0320 13:41:48.543870 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c\": container with ID starting with 9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c not found: ID does not exist" containerID="9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.543909 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c"} err="failed to get container status \"9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c\": rpc error: code = NotFound desc = could not find container \"9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c\": container with ID starting with 9580b8f750a7f4e3fb120847a51e82d1225df35d3009d3d64f9d44cd38f6953c not found: ID does not exist" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.543969 4849 scope.go:117] "RemoveContainer" containerID="d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10" Mar 20 13:41:48 crc kubenswrapper[4849]: E0320 
13:41:48.544367 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10\": container with ID starting with d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10 not found: ID does not exist" containerID="d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10" Mar 20 13:41:48 crc kubenswrapper[4849]: I0320 13:41:48.544391 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10"} err="failed to get container status \"d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10\": rpc error: code = NotFound desc = could not find container \"d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10\": container with ID starting with d389682a2d43a4bea2e8cd1b7ce5233e710d7006d416ada9063d6df14cd1fb10 not found: ID does not exist" Mar 20 13:41:49 crc kubenswrapper[4849]: I0320 13:41:49.044205 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0615834a-954f-4364-9c31-438c573df61a" path="/var/lib/kubelet/pods/0615834a-954f-4364-9c31-438c573df61a/volumes" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.130399 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566902-hv5cs"] Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131256 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0615834a-954f-4364-9c31-438c573df61a" containerName="extract-content" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131273 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0615834a-954f-4364-9c31-438c573df61a" containerName="extract-content" Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131288 4849 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131296 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131305 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerName="extract-content" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131312 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerName="extract-content" Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131330 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0615834a-954f-4364-9c31-438c573df61a" containerName="extract-utilities" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131339 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0615834a-954f-4364-9c31-438c573df61a" containerName="extract-utilities" Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131354 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerName="extract-utilities" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131362 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerName="extract-utilities" Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131371 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0615834a-954f-4364-9c31-438c573df61a" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131378 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0615834a-954f-4364-9c31-438c573df61a" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131389 4849 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131396 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131406 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerName="extract-content" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131413 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerName="extract-content" Mar 20 13:42:00 crc kubenswrapper[4849]: E0320 13:42:00.131424 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerName="extract-utilities" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131431 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerName="extract-utilities" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131587 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="79fa63f2-b3ce-45c8-8b3a-4673f91ab6e1" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131602 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0615834a-954f-4364-9c31-438c573df61a" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.131615 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ea786d-a24c-4b9c-9949-8a763f3be064" containerName="registry-server" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.132217 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-hv5cs" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.134020 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.134020 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.134999 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.144146 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-hv5cs"] Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.269054 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r59k\" (UniqueName: \"kubernetes.io/projected/a73c059a-d570-4e73-8119-9f14474c9d99-kube-api-access-7r59k\") pod \"auto-csr-approver-29566902-hv5cs\" (UID: \"a73c059a-d570-4e73-8119-9f14474c9d99\") " pod="openshift-infra/auto-csr-approver-29566902-hv5cs" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.371839 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r59k\" (UniqueName: \"kubernetes.io/projected/a73c059a-d570-4e73-8119-9f14474c9d99-kube-api-access-7r59k\") pod \"auto-csr-approver-29566902-hv5cs\" (UID: \"a73c059a-d570-4e73-8119-9f14474c9d99\") " pod="openshift-infra/auto-csr-approver-29566902-hv5cs" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.402671 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r59k\" (UniqueName: \"kubernetes.io/projected/a73c059a-d570-4e73-8119-9f14474c9d99-kube-api-access-7r59k\") pod \"auto-csr-approver-29566902-hv5cs\" (UID: \"a73c059a-d570-4e73-8119-9f14474c9d99\") " 
pod="openshift-infra/auto-csr-approver-29566902-hv5cs" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.452246 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-hv5cs" Mar 20 13:42:00 crc kubenswrapper[4849]: I0320 13:42:00.885496 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-hv5cs"] Mar 20 13:42:01 crc kubenswrapper[4849]: I0320 13:42:01.101324 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-hv5cs" event={"ID":"a73c059a-d570-4e73-8119-9f14474c9d99","Type":"ContainerStarted","Data":"e4181ea0c16066a0622bc39cb877762da3af6d6a5ccd0ea87dec27b30fe32209"} Mar 20 13:42:03 crc kubenswrapper[4849]: I0320 13:42:03.131464 4849 generic.go:334] "Generic (PLEG): container finished" podID="a73c059a-d570-4e73-8119-9f14474c9d99" containerID="d9e371ebb37def4d04e95c4b419632fe0f70b889bee2d0cd5866c0702abd9deb" exitCode=0 Mar 20 13:42:03 crc kubenswrapper[4849]: I0320 13:42:03.131552 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-hv5cs" event={"ID":"a73c059a-d570-4e73-8119-9f14474c9d99","Type":"ContainerDied","Data":"d9e371ebb37def4d04e95c4b419632fe0f70b889bee2d0cd5866c0702abd9deb"} Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.509442 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-hv5cs" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.634564 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r59k\" (UniqueName: \"kubernetes.io/projected/a73c059a-d570-4e73-8119-9f14474c9d99-kube-api-access-7r59k\") pod \"a73c059a-d570-4e73-8119-9f14474c9d99\" (UID: \"a73c059a-d570-4e73-8119-9f14474c9d99\") " Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.641528 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73c059a-d570-4e73-8119-9f14474c9d99-kube-api-access-7r59k" (OuterVolumeSpecName: "kube-api-access-7r59k") pod "a73c059a-d570-4e73-8119-9f14474c9d99" (UID: "a73c059a-d570-4e73-8119-9f14474c9d99"). InnerVolumeSpecName "kube-api-access-7r59k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.687875 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bsb54"] Mar 20 13:42:04 crc kubenswrapper[4849]: E0320 13:42:04.688259 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73c059a-d570-4e73-8119-9f14474c9d99" containerName="oc" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.688281 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73c059a-d570-4e73-8119-9f14474c9d99" containerName="oc" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.688428 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73c059a-d570-4e73-8119-9f14474c9d99" containerName="oc" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.689287 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.691277 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.695168 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kdsbj" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.695208 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.695322 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.701959 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bsb54"] Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.739050 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r59k\" (UniqueName: \"kubernetes.io/projected/a73c059a-d570-4e73-8119-9f14474c9d99-kube-api-access-7r59k\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.774626 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65z78"] Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.775702 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.777666 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.791388 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65z78"] Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.847556 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed298817-d1eb-4db5-9783-700b3a312bd7-config\") pod \"dnsmasq-dns-675f4bcbfc-bsb54\" (UID: \"ed298817-d1eb-4db5-9783-700b3a312bd7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.847655 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plkk\" (UniqueName: \"kubernetes.io/projected/ed298817-d1eb-4db5-9783-700b3a312bd7-kube-api-access-8plkk\") pod \"dnsmasq-dns-675f4bcbfc-bsb54\" (UID: \"ed298817-d1eb-4db5-9783-700b3a312bd7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.948491 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.948542 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkt8q\" (UniqueName: \"kubernetes.io/projected/82070c55-8507-4a82-8872-053f269f0cba-kube-api-access-vkt8q\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.948606 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed298817-d1eb-4db5-9783-700b3a312bd7-config\") pod \"dnsmasq-dns-675f4bcbfc-bsb54\" (UID: \"ed298817-d1eb-4db5-9783-700b3a312bd7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.948652 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-config\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.948699 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plkk\" (UniqueName: \"kubernetes.io/projected/ed298817-d1eb-4db5-9783-700b3a312bd7-kube-api-access-8plkk\") pod \"dnsmasq-dns-675f4bcbfc-bsb54\" (UID: \"ed298817-d1eb-4db5-9783-700b3a312bd7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.949732 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed298817-d1eb-4db5-9783-700b3a312bd7-config\") pod \"dnsmasq-dns-675f4bcbfc-bsb54\" (UID: \"ed298817-d1eb-4db5-9783-700b3a312bd7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:04 crc kubenswrapper[4849]: I0320 13:42:04.975687 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plkk\" (UniqueName: \"kubernetes.io/projected/ed298817-d1eb-4db5-9783-700b3a312bd7-kube-api-access-8plkk\") pod \"dnsmasq-dns-675f4bcbfc-bsb54\" (UID: \"ed298817-d1eb-4db5-9783-700b3a312bd7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:05 crc 
kubenswrapper[4849]: I0320 13:42:05.010492 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.049495 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-config\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.049877 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.049900 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkt8q\" (UniqueName: \"kubernetes.io/projected/82070c55-8507-4a82-8872-053f269f0cba-kube-api-access-vkt8q\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.050348 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-config\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.050531 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.070100 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkt8q\" (UniqueName: \"kubernetes.io/projected/82070c55-8507-4a82-8872-053f269f0cba-kube-api-access-vkt8q\") pod \"dnsmasq-dns-78dd6ddcc-65z78\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.096288 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.161422 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-hv5cs" event={"ID":"a73c059a-d570-4e73-8119-9f14474c9d99","Type":"ContainerDied","Data":"e4181ea0c16066a0622bc39cb877762da3af6d6a5ccd0ea87dec27b30fe32209"} Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.161468 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4181ea0c16066a0622bc39cb877762da3af6d6a5ccd0ea87dec27b30fe32209" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.161529 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-hv5cs" Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.424290 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bsb54"] Mar 20 13:42:05 crc kubenswrapper[4849]: W0320 13:42:05.427297 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded298817_d1eb_4db5_9783_700b3a312bd7.slice/crio-6bbcd513b9079b47d0a17440273a7aa9e004c724f58726cca36e7d62fab28161 WatchSource:0}: Error finding container 6bbcd513b9079b47d0a17440273a7aa9e004c724f58726cca36e7d62fab28161: Status 404 returned error can't find the container with id 6bbcd513b9079b47d0a17440273a7aa9e004c724f58726cca36e7d62fab28161 Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.550661 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65z78"] Mar 20 13:42:05 crc kubenswrapper[4849]: W0320 13:42:05.554250 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82070c55_8507_4a82_8872_053f269f0cba.slice/crio-03be55d9d184736620d720742fc0700a7651286ab590e8b2b35d67381845a9ac WatchSource:0}: Error finding container 03be55d9d184736620d720742fc0700a7651286ab590e8b2b35d67381845a9ac: Status 404 returned error can't find the container with id 03be55d9d184736620d720742fc0700a7651286ab590e8b2b35d67381845a9ac Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.576626 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-vljgv"] Mar 20 13:42:05 crc kubenswrapper[4849]: I0320 13:42:05.586947 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-vljgv"] Mar 20 13:42:06 crc kubenswrapper[4849]: I0320 13:42:06.172101 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" 
event={"ID":"ed298817-d1eb-4db5-9783-700b3a312bd7","Type":"ContainerStarted","Data":"6bbcd513b9079b47d0a17440273a7aa9e004c724f58726cca36e7d62fab28161"} Mar 20 13:42:06 crc kubenswrapper[4849]: I0320 13:42:06.173285 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" event={"ID":"82070c55-8507-4a82-8872-053f269f0cba","Type":"ContainerStarted","Data":"03be55d9d184736620d720742fc0700a7651286ab590e8b2b35d67381845a9ac"} Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.046752 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4518bf0-8998-47ae-bf3b-8920118e0aed" path="/var/lib/kubelet/pods/b4518bf0-8998-47ae-bf3b-8920118e0aed/volumes" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.553242 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bsb54"] Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.570205 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qvbj6"] Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.574376 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.584801 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qvbj6"] Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.587632 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngvwt\" (UniqueName: \"kubernetes.io/projected/2358ecf4-6327-4cc9-bcc5-c822e2215540-kube-api-access-ngvwt\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.587689 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-config\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.587761 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-dns-svc\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.691125 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-dns-svc\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.691336 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngvwt\" (UniqueName: 
\"kubernetes.io/projected/2358ecf4-6327-4cc9-bcc5-c822e2215540-kube-api-access-ngvwt\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.691371 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-config\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.692275 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-dns-svc\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.693835 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-config\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.723140 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngvwt\" (UniqueName: \"kubernetes.io/projected/2358ecf4-6327-4cc9-bcc5-c822e2215540-kube-api-access-ngvwt\") pod \"dnsmasq-dns-666b6646f7-qvbj6\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.796025 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65z78"] Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.812268 4849 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-s4zh7"] Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.814022 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.837921 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s4zh7"] Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.893228 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-config\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.893272 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.893298 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2mzk\" (UniqueName: \"kubernetes.io/projected/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-kube-api-access-v2mzk\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.941528 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.994041 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-config\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.994154 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.994219 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2mzk\" (UniqueName: \"kubernetes.io/projected/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-kube-api-access-v2mzk\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.994884 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-config\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:07 crc kubenswrapper[4849]: I0320 13:42:07.994923 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 
13:42:08.013987 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2mzk\" (UniqueName: \"kubernetes.io/projected/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-kube-api-access-v2mzk\") pod \"dnsmasq-dns-57d769cc4f-s4zh7\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.158561 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.409574 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s4zh7"] Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.417587 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qvbj6"] Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.536583 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.542438 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.547211 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.547361 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.547365 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.547511 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.547574 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.547949 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.548898 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vvnpq" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.553580 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706384 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706618 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706641 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706673 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-config-data\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706692 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/464306bd-0d8b-40ca-aa64-1ec5a00a527b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706726 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706760 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706777 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/464306bd-0d8b-40ca-aa64-1ec5a00a527b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706800 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706831 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mv6r\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-kube-api-access-7mv6r\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.706967 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.770252 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.771766 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.774878 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.775042 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.775146 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-t2n6r" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.775275 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.775422 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.775504 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.775292 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.778981 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.807689 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.807739 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.807761 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.807777 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.807799 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-config-data\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.807836 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/464306bd-0d8b-40ca-aa64-1ec5a00a527b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.808248 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 
13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.808294 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.808310 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/464306bd-0d8b-40ca-aa64-1ec5a00a527b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.808335 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.808352 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mv6r\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-kube-api-access-7mv6r\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.808426 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.808654 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.809580 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.811106 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-config-data\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.815805 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/464306bd-0d8b-40ca-aa64-1ec5a00a527b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.816219 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.818247 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/464306bd-0d8b-40ca-aa64-1ec5a00a527b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " 
pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.818360 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/464306bd-0d8b-40ca-aa64-1ec5a00a527b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.823164 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.838682 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mv6r\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-kube-api-access-7mv6r\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.840634 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/464306bd-0d8b-40ca-aa64-1ec5a00a527b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.869343 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"464306bd-0d8b-40ca-aa64-1ec5a00a527b\") " pod="openstack/rabbitmq-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910219 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zxh84\" (UniqueName: \"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-kube-api-access-zxh84\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910256 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910291 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910316 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910346 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c3c4952-4c22-4389-834c-969b89fb9e20-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910369 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910844 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910874 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c3c4952-4c22-4389-834c-969b89fb9e20-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910919 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910965 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:08 crc kubenswrapper[4849]: I0320 13:42:08.910995 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012578 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c3c4952-4c22-4389-834c-969b89fb9e20-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012658 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012683 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012704 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c3c4952-4c22-4389-834c-969b89fb9e20-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012725 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012754 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012771 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012792 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxh84\" (UniqueName: \"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-kube-api-access-zxh84\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012830 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012865 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.012889 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.013807 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.014270 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.014721 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.015611 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.017684 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.019141 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3c3c4952-4c22-4389-834c-969b89fb9e20-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.019568 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3c3c4952-4c22-4389-834c-969b89fb9e20-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.027108 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.028796 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3c3c4952-4c22-4389-834c-969b89fb9e20-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.030922 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxh84\" (UniqueName: 
\"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-kube-api-access-zxh84\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.031283 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3c3c4952-4c22-4389-834c-969b89fb9e20-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.037160 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"3c3c4952-4c22-4389-834c-969b89fb9e20\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.168791 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:42:09 crc kubenswrapper[4849]: I0320 13:42:09.213656 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.146422 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.149231 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.151982 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.152385 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-n77wc" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.152883 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.159134 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.160966 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.165025 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.234810 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.234882 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.234938 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.234963 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.235000 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.235054 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.235077 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.235099 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsmq9\" (UniqueName: 
\"kubernetes.io/projected/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-kube-api-access-gsmq9\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.336675 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.336730 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.336780 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.336804 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.336845 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.336867 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsmq9\" (UniqueName: \"kubernetes.io/projected/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-kube-api-access-gsmq9\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.336896 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.336919 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.337902 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.338524 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: 
I0320 13:42:10.338583 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-kolla-config\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.338705 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.338892 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-config-data-default\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.342952 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.343777 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.359992 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsmq9\" (UniqueName: 
\"kubernetes.io/projected/b4ef098b-892c-4619-a5b4-7c10cdf47f9b-kube-api-access-gsmq9\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.364519 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"b4ef098b-892c-4619-a5b4-7c10cdf47f9b\") " pod="openstack/openstack-galera-0" Mar 20 13:42:10 crc kubenswrapper[4849]: I0320 13:42:10.489961 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.491075 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.492930 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.495032 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8gmh9" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.495179 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.495526 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.496156 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.496303 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.554198 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f905722-c565-4fe5-bdde-0df02a23b833-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.554481 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f905722-c565-4fe5-bdde-0df02a23b833-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.554602 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.554717 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.554811 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmsxj\" (UniqueName: \"kubernetes.io/projected/4f905722-c565-4fe5-bdde-0df02a23b833-kube-api-access-tmsxj\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc 
kubenswrapper[4849]: I0320 13:42:11.554925 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.555051 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f905722-c565-4fe5-bdde-0df02a23b833-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.555157 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.656714 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f905722-c565-4fe5-bdde-0df02a23b833-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.656761 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.656893 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f905722-c565-4fe5-bdde-0df02a23b833-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.656938 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f905722-c565-4fe5-bdde-0df02a23b833-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.656970 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.656994 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.657017 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmsxj\" (UniqueName: \"kubernetes.io/projected/4f905722-c565-4fe5-bdde-0df02a23b833-kube-api-access-tmsxj\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.657038 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.657693 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.657860 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4f905722-c565-4fe5-bdde-0df02a23b833-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.658175 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.658802 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.659088 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4f905722-c565-4fe5-bdde-0df02a23b833-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.662158 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f905722-c565-4fe5-bdde-0df02a23b833-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.663877 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f905722-c565-4fe5-bdde-0df02a23b833-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.679009 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmsxj\" (UniqueName: \"kubernetes.io/projected/4f905722-c565-4fe5-bdde-0df02a23b833-kube-api-access-tmsxj\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.679455 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4f905722-c565-4fe5-bdde-0df02a23b833\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.795467 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.796603 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.799243 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.799452 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2k2jq" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.807165 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.824330 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.842177 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.873837 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jbz\" (UniqueName: \"kubernetes.io/projected/200bac0a-008a-4528-bb22-3cf6e1ef6342-kube-api-access-j7jbz\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.873896 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/200bac0a-008a-4528-bb22-3cf6e1ef6342-config-data\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.873924 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/200bac0a-008a-4528-bb22-3cf6e1ef6342-memcached-tls-certs\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") 
" pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.874015 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/200bac0a-008a-4528-bb22-3cf6e1ef6342-kolla-config\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.874079 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200bac0a-008a-4528-bb22-3cf6e1ef6342-combined-ca-bundle\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.975863 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/200bac0a-008a-4528-bb22-3cf6e1ef6342-kolla-config\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.975946 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200bac0a-008a-4528-bb22-3cf6e1ef6342-combined-ca-bundle\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.976012 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jbz\" (UniqueName: \"kubernetes.io/projected/200bac0a-008a-4528-bb22-3cf6e1ef6342-kube-api-access-j7jbz\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.976045 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/200bac0a-008a-4528-bb22-3cf6e1ef6342-config-data\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.976069 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/200bac0a-008a-4528-bb22-3cf6e1ef6342-memcached-tls-certs\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.976812 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/200bac0a-008a-4528-bb22-3cf6e1ef6342-config-data\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.976849 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/200bac0a-008a-4528-bb22-3cf6e1ef6342-kolla-config\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.985325 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/200bac0a-008a-4528-bb22-3cf6e1ef6342-memcached-tls-certs\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc kubenswrapper[4849]: I0320 13:42:11.986968 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200bac0a-008a-4528-bb22-3cf6e1ef6342-combined-ca-bundle\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:11 crc 
kubenswrapper[4849]: I0320 13:42:11.997662 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jbz\" (UniqueName: \"kubernetes.io/projected/200bac0a-008a-4528-bb22-3cf6e1ef6342-kube-api-access-j7jbz\") pod \"memcached-0\" (UID: \"200bac0a-008a-4528-bb22-3cf6e1ef6342\") " pod="openstack/memcached-0" Mar 20 13:42:12 crc kubenswrapper[4849]: I0320 13:42:12.136562 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 13:42:12 crc kubenswrapper[4849]: I0320 13:42:12.231617 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" event={"ID":"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d","Type":"ContainerStarted","Data":"0e3b8da96699957defe3c14768e177465b2e79ea34734a88bbb2a903e1906720"} Mar 20 13:42:12 crc kubenswrapper[4849]: I0320 13:42:12.232704 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" event={"ID":"2358ecf4-6327-4cc9-bcc5-c822e2215540","Type":"ContainerStarted","Data":"5d6514b3c3b5177224a2b278075f01fcd03f7e19cb6280f672d49dd2d4db61b7"} Mar 20 13:42:13 crc kubenswrapper[4849]: I0320 13:42:13.958979 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:42:13 crc kubenswrapper[4849]: I0320 13:42:13.961659 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:42:13 crc kubenswrapper[4849]: I0320 13:42:13.968239 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5zxvv" Mar 20 13:42:13 crc kubenswrapper[4849]: I0320 13:42:13.970658 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:42:14 crc kubenswrapper[4849]: I0320 13:42:14.017213 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqpqr\" (UniqueName: \"kubernetes.io/projected/e53df741-614d-449c-8da6-4de0333a6e9b-kube-api-access-hqpqr\") pod \"kube-state-metrics-0\" (UID: \"e53df741-614d-449c-8da6-4de0333a6e9b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:42:14 crc kubenswrapper[4849]: I0320 13:42:14.120141 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqpqr\" (UniqueName: \"kubernetes.io/projected/e53df741-614d-449c-8da6-4de0333a6e9b-kube-api-access-hqpqr\") pod \"kube-state-metrics-0\" (UID: \"e53df741-614d-449c-8da6-4de0333a6e9b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:42:14 crc kubenswrapper[4849]: I0320 13:42:14.138055 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqpqr\" (UniqueName: \"kubernetes.io/projected/e53df741-614d-449c-8da6-4de0333a6e9b-kube-api-access-hqpqr\") pod \"kube-state-metrics-0\" (UID: \"e53df741-614d-449c-8da6-4de0333a6e9b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:42:14 crc kubenswrapper[4849]: I0320 13:42:14.326662 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.206183 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9znfj"] Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.207634 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.211250 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-bsxdf" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.211384 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.211481 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.215225 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9znfj"] Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.239998 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-226bs"] Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.241563 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.262000 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-226bs"] Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.361603 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-log-ovn\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.361658 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57363bb0-8542-49ea-95b9-84fd9206f644-scripts\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.361688 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-run\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.361711 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-lib\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.361796 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f589037a-06aa-452d-82ef-0dbf2177b7fc-ovn-controller-tls-certs\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.361831 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-etc-ovs\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.361876 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f589037a-06aa-452d-82ef-0dbf2177b7fc-scripts\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.361905 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5ph\" (UniqueName: \"kubernetes.io/projected/57363bb0-8542-49ea-95b9-84fd9206f644-kube-api-access-fl5ph\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.362475 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbsgt\" (UniqueName: \"kubernetes.io/projected/f589037a-06aa-452d-82ef-0dbf2177b7fc-kube-api-access-jbsgt\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.362497 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-log\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.362551 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-run-ovn\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.395075 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-run\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.395162 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f589037a-06aa-452d-82ef-0dbf2177b7fc-combined-ca-bundle\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496302 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f589037a-06aa-452d-82ef-0dbf2177b7fc-ovn-controller-tls-certs\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496367 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-etc-ovs\") 
pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496394 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f589037a-06aa-452d-82ef-0dbf2177b7fc-scripts\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496421 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5ph\" (UniqueName: \"kubernetes.io/projected/57363bb0-8542-49ea-95b9-84fd9206f644-kube-api-access-fl5ph\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496444 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbsgt\" (UniqueName: \"kubernetes.io/projected/f589037a-06aa-452d-82ef-0dbf2177b7fc-kube-api-access-jbsgt\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496465 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-log\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496514 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-run-ovn\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" 
Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496531 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-run\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496560 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f589037a-06aa-452d-82ef-0dbf2177b7fc-combined-ca-bundle\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496586 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-log-ovn\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496609 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57363bb0-8542-49ea-95b9-84fd9206f644-scripts\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496631 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-run\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.496651 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" 
(UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-lib\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.497264 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-lib\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.497900 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-log\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.498091 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-run-ovn\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.498099 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-etc-ovs\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.498223 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-run\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" 
Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.499433 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57363bb0-8542-49ea-95b9-84fd9206f644-scripts\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.499522 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/57363bb0-8542-49ea-95b9-84fd9206f644-var-run\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.499594 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f589037a-06aa-452d-82ef-0dbf2177b7fc-var-log-ovn\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.511270 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f589037a-06aa-452d-82ef-0dbf2177b7fc-ovn-controller-tls-certs\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.513638 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f589037a-06aa-452d-82ef-0dbf2177b7fc-combined-ca-bundle\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.515091 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbsgt\" 
(UniqueName: \"kubernetes.io/projected/f589037a-06aa-452d-82ef-0dbf2177b7fc-kube-api-access-jbsgt\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.515636 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5ph\" (UniqueName: \"kubernetes.io/projected/57363bb0-8542-49ea-95b9-84fd9206f644-kube-api-access-fl5ph\") pod \"ovn-controller-ovs-226bs\" (UID: \"57363bb0-8542-49ea-95b9-84fd9206f644\") " pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.516335 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f589037a-06aa-452d-82ef-0dbf2177b7fc-scripts\") pod \"ovn-controller-9znfj\" (UID: \"f589037a-06aa-452d-82ef-0dbf2177b7fc\") " pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.534291 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9znfj" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.557881 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.645030 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.646124 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.651972 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.652006 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.652196 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-d8r9l" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.652785 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.658331 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.672116 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.699163 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.699217 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.699247 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzql\" (UniqueName: \"kubernetes.io/projected/91eeca7c-4c91-4b2f-8541-be7b6a36b582-kube-api-access-pnzql\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.699274 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91eeca7c-4c91-4b2f-8541-be7b6a36b582-config\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.699298 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91eeca7c-4c91-4b2f-8541-be7b6a36b582-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.699314 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.699359 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.699386 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/91eeca7c-4c91-4b2f-8541-be7b6a36b582-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.801022 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.801091 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.801129 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzql\" (UniqueName: \"kubernetes.io/projected/91eeca7c-4c91-4b2f-8541-be7b6a36b582-kube-api-access-pnzql\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.801164 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91eeca7c-4c91-4b2f-8541-be7b6a36b582-config\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.801198 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91eeca7c-4c91-4b2f-8541-be7b6a36b582-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " 
pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.801227 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.801280 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.801397 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91eeca7c-4c91-4b2f-8541-be7b6a36b582-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.802178 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/91eeca7c-4c91-4b2f-8541-be7b6a36b582-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.802336 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.802599 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/91eeca7c-4c91-4b2f-8541-be7b6a36b582-config\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.802910 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/91eeca7c-4c91-4b2f-8541-be7b6a36b582-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.806114 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.806478 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.813537 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91eeca7c-4c91-4b2f-8541-be7b6a36b582-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.823498 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzql\" (UniqueName: \"kubernetes.io/projected/91eeca7c-4c91-4b2f-8541-be7b6a36b582-kube-api-access-pnzql\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " 
pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:17 crc kubenswrapper[4849]: I0320 13:42:17.845127 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"91eeca7c-4c91-4b2f-8541-be7b6a36b582\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:18 crc kubenswrapper[4849]: I0320 13:42:18.004170 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:19 crc kubenswrapper[4849]: E0320 13:42:19.720583 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:42:19 crc kubenswrapper[4849]: E0320 13:42:19.721064 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vkt8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-65z78_openstack(82070c55-8507-4a82-8872-053f269f0cba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:19 crc kubenswrapper[4849]: E0320 13:42:19.722562 4849 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" podUID="82070c55-8507-4a82-8872-053f269f0cba" Mar 20 13:42:19 crc kubenswrapper[4849]: E0320 13:42:19.728990 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:42:19 crc kubenswrapper[4849]: E0320 13:42:19.729129 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8plkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bsb54_openstack(ed298817-d1eb-4db5-9783-700b3a312bd7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:19 crc kubenswrapper[4849]: E0320 13:42:19.730319 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" podUID="ed298817-d1eb-4db5-9783-700b3a312bd7" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.267347 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.287823 4849 generic.go:334] "Generic (PLEG): container finished" podID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerID="21a13c4d8384be90d2e2786b5b8ff8d7c438c4a07a60b64f355f5e37309b8d47" exitCode=0 Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.288535 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" event={"ID":"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d","Type":"ContainerDied","Data":"21a13c4d8384be90d2e2786b5b8ff8d7c438c4a07a60b64f355f5e37309b8d47"} Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.297987 4849 generic.go:334] "Generic (PLEG): container finished" podID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerID="c55d51de1e429f7a6122819d9c859c2241d88ee921f1b150214002d0bb1ad1fc" exitCode=0 Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.298093 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" event={"ID":"2358ecf4-6327-4cc9-bcc5-c822e2215540","Type":"ContainerDied","Data":"c55d51de1e429f7a6122819d9c859c2241d88ee921f1b150214002d0bb1ad1fc"} Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.399392 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.416629 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:42:20 crc kubenswrapper[4849]: W0320 13:42:20.421071 4849 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f905722_c565_4fe5_bdde_0df02a23b833.slice/crio-041e0f25101f842c07e7810e555e6f1a4177a3f132793d1fadf77d4206a795f4 WatchSource:0}: Error finding container 041e0f25101f842c07e7810e555e6f1a4177a3f132793d1fadf77d4206a795f4: Status 404 returned error can't find the container with id 041e0f25101f842c07e7810e555e6f1a4177a3f132793d1fadf77d4206a795f4 Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.485872 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.534922 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9znfj"] Mar 20 13:42:20 crc kubenswrapper[4849]: W0320 13:42:20.536793 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3c4952_4c22_4389_834c_969b89fb9e20.slice/crio-d145f502a1654fc592c638ac3f4d7252d0cc5f9404df487223878eb91feb7806 WatchSource:0}: Error finding container d145f502a1654fc592c638ac3f4d7252d0cc5f9404df487223878eb91feb7806: Status 404 returned error can't find the container with id d145f502a1654fc592c638ac3f4d7252d0cc5f9404df487223878eb91feb7806 Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.549603 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.557432 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:42:20 crc kubenswrapper[4849]: E0320 13:42:20.607890 4849 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 13:42:20 crc kubenswrapper[4849]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/2358ecf4-6327-4cc9-bcc5-c822e2215540/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 13:42:20 crc 
kubenswrapper[4849]: > podSandboxID="5d6514b3c3b5177224a2b278075f01fcd03f7e19cb6280f672d49dd2d4db61b7" Mar 20 13:42:20 crc kubenswrapper[4849]: E0320 13:42:20.608057 4849 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:42:20 crc kubenswrapper[4849]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ngvwt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-qvbj6_openstack(2358ecf4-6327-4cc9-bcc5-c822e2215540): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/2358ecf4-6327-4cc9-bcc5-c822e2215540/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 13:42:20 crc kubenswrapper[4849]: > logger="UnhandledError" Mar 20 13:42:20 crc kubenswrapper[4849]: E0320 13:42:20.609419 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/2358ecf4-6327-4cc9-bcc5-c822e2215540/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.632497 4849 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-226bs"] Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.645403 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.706594 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.745953 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed298817-d1eb-4db5-9783-700b3a312bd7-config\") pod \"ed298817-d1eb-4db5-9783-700b3a312bd7\" (UID: \"ed298817-d1eb-4db5-9783-700b3a312bd7\") " Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.747157 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkt8q\" (UniqueName: \"kubernetes.io/projected/82070c55-8507-4a82-8872-053f269f0cba-kube-api-access-vkt8q\") pod \"82070c55-8507-4a82-8872-053f269f0cba\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.746691 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed298817-d1eb-4db5-9783-700b3a312bd7-config" (OuterVolumeSpecName: "config") pod "ed298817-d1eb-4db5-9783-700b3a312bd7" (UID: "ed298817-d1eb-4db5-9783-700b3a312bd7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.747450 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-dns-svc\") pod \"82070c55-8507-4a82-8872-053f269f0cba\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.747562 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8plkk\" (UniqueName: \"kubernetes.io/projected/ed298817-d1eb-4db5-9783-700b3a312bd7-kube-api-access-8plkk\") pod \"ed298817-d1eb-4db5-9783-700b3a312bd7\" (UID: \"ed298817-d1eb-4db5-9783-700b3a312bd7\") " Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.747677 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-config\") pod \"82070c55-8507-4a82-8872-053f269f0cba\" (UID: \"82070c55-8507-4a82-8872-053f269f0cba\") " Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.747956 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82070c55-8507-4a82-8872-053f269f0cba" (UID: "82070c55-8507-4a82-8872-053f269f0cba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.748079 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-config" (OuterVolumeSpecName: "config") pod "82070c55-8507-4a82-8872-053f269f0cba" (UID: "82070c55-8507-4a82-8872-053f269f0cba"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.748438 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.748527 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82070c55-8507-4a82-8872-053f269f0cba-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.748638 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed298817-d1eb-4db5-9783-700b3a312bd7-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.751366 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82070c55-8507-4a82-8872-053f269f0cba-kube-api-access-vkt8q" (OuterVolumeSpecName: "kube-api-access-vkt8q") pod "82070c55-8507-4a82-8872-053f269f0cba" (UID: "82070c55-8507-4a82-8872-053f269f0cba"). InnerVolumeSpecName "kube-api-access-vkt8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.751487 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed298817-d1eb-4db5-9783-700b3a312bd7-kube-api-access-8plkk" (OuterVolumeSpecName: "kube-api-access-8plkk") pod "ed298817-d1eb-4db5-9783-700b3a312bd7" (UID: "ed298817-d1eb-4db5-9783-700b3a312bd7"). InnerVolumeSpecName "kube-api-access-8plkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.849773 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkt8q\" (UniqueName: \"kubernetes.io/projected/82070c55-8507-4a82-8872-053f269f0cba-kube-api-access-vkt8q\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:20 crc kubenswrapper[4849]: I0320 13:42:20.850084 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8plkk\" (UniqueName: \"kubernetes.io/projected/ed298817-d1eb-4db5-9783-700b3a312bd7-kube-api-access-8plkk\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.218722 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.219995 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.220092 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.227289 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4vkkq" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.227540 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.227962 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.228108 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.256743 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8qpn\" (UniqueName: \"kubernetes.io/projected/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-kube-api-access-n8qpn\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.256860 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.256900 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.257037 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-config\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.257185 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.257272 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.257475 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.257580 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.308192 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9znfj" 
event={"ID":"f589037a-06aa-452d-82ef-0dbf2177b7fc","Type":"ContainerStarted","Data":"cbbff5da9c91b5c771091d45dcadcdaca171d233e64d39f14156e7fc83bae29f"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.310096 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"464306bd-0d8b-40ca-aa64-1ec5a00a527b","Type":"ContainerStarted","Data":"55a85160adb2116fce5c665c1652e9cdca1301474b48b2c96b6117b671da0ec2"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.314329 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c3c4952-4c22-4389-834c-969b89fb9e20","Type":"ContainerStarted","Data":"d145f502a1654fc592c638ac3f4d7252d0cc5f9404df487223878eb91feb7806"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.315775 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e53df741-614d-449c-8da6-4de0333a6e9b","Type":"ContainerStarted","Data":"4db35a6293e0d0e35cffdf5e0ecf13489fb7744802dfa482dda428a1981b447c"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.320716 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" event={"ID":"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d","Type":"ContainerStarted","Data":"66d46567d825f49b95e07354634d38330f53173a8c0c774896b92ad877f52cee"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.321011 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.324582 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f905722-c565-4fe5-bdde-0df02a23b833","Type":"ContainerStarted","Data":"041e0f25101f842c07e7810e555e6f1a4177a3f132793d1fadf77d4206a795f4"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.333253 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-226bs" event={"ID":"57363bb0-8542-49ea-95b9-84fd9206f644","Type":"ContainerStarted","Data":"bc430eecb5a34daaa51a33217dda43ab7e5422c0cb8c2454701253fcc9db5b6d"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.339650 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" podStartSLOduration=5.834425659 podStartE2EDuration="14.33961146s" podCreationTimestamp="2026-03-20 13:42:07 +0000 UTC" firstStartedPulling="2026-03-20 13:42:11.381158186 +0000 UTC m=+1081.058881571" lastFinishedPulling="2026-03-20 13:42:19.886343977 +0000 UTC m=+1089.564067372" observedRunningTime="2026-03-20 13:42:21.336597447 +0000 UTC m=+1091.014320862" watchObservedRunningTime="2026-03-20 13:42:21.33961146 +0000 UTC m=+1091.017334855" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.343744 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"200bac0a-008a-4528-bb22-3cf6e1ef6342","Type":"ContainerStarted","Data":"30085a81a363286558f0bac01c7ddd0f294085ce185d5f306521dfeffbfd1b49"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.347192 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b4ef098b-892c-4619-a5b4-7c10cdf47f9b","Type":"ContainerStarted","Data":"d9b73262bf316ad583eb5d9b246bb482aea1fb38e0b92dec2b18ce3ee95b11fd"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.349635 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" event={"ID":"ed298817-d1eb-4db5-9783-700b3a312bd7","Type":"ContainerDied","Data":"6bbcd513b9079b47d0a17440273a7aa9e004c724f58726cca36e7d62fab28161"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.349671 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bsb54" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.353929 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" event={"ID":"82070c55-8507-4a82-8872-053f269f0cba","Type":"ContainerDied","Data":"03be55d9d184736620d720742fc0700a7651286ab590e8b2b35d67381845a9ac"} Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.354207 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-65z78" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.358874 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.358918 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.358952 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.359025 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8qpn\" (UniqueName: \"kubernetes.io/projected/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-kube-api-access-n8qpn\") pod \"ovsdbserver-sb-0\" (UID: 
\"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.359445 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.360213 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.361332 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.361371 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.361472 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-config\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.362099 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.362303 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.362876 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-config\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.374544 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.377850 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.378465 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8qpn\" (UniqueName: \"kubernetes.io/projected/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-kube-api-access-n8qpn\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") 
" pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.404699 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65z78"] Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.415020 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c5e6b3b-dc09-46d1-aac2-6625c28896fb-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.417781 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9c5e6b3b-dc09-46d1-aac2-6625c28896fb\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.444967 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-65z78"] Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.467730 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bsb54"] Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.485274 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bsb54"] Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.568332 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:21 crc kubenswrapper[4849]: I0320 13:42:21.645589 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:42:22 crc kubenswrapper[4849]: I0320 13:42:22.310117 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:42:22 crc kubenswrapper[4849]: I0320 13:42:22.364757 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"91eeca7c-4c91-4b2f-8541-be7b6a36b582","Type":"ContainerStarted","Data":"ac435cf76baf87407ff0b1c017ccff245fb65e0d49dea32fdf0b4b06b7600441"} Mar 20 13:42:22 crc kubenswrapper[4849]: I0320 13:42:22.368940 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" event={"ID":"2358ecf4-6327-4cc9-bcc5-c822e2215540","Type":"ContainerStarted","Data":"7c3561beff2f73490be09bc5d21ae2b8aff727724cea1cf79aedaaf66c2c2ca9"} Mar 20 13:42:22 crc kubenswrapper[4849]: I0320 13:42:22.369382 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:23 crc kubenswrapper[4849]: I0320 13:42:23.058631 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82070c55-8507-4a82-8872-053f269f0cba" path="/var/lib/kubelet/pods/82070c55-8507-4a82-8872-053f269f0cba/volumes" Mar 20 13:42:23 crc kubenswrapper[4849]: I0320 13:42:23.059091 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed298817-d1eb-4db5-9783-700b3a312bd7" path="/var/lib/kubelet/pods/ed298817-d1eb-4db5-9783-700b3a312bd7/volumes" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.292810 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" podStartSLOduration=9.719828004 podStartE2EDuration="18.292771777s" podCreationTimestamp="2026-03-20 13:42:07 +0000 UTC" firstStartedPulling="2026-03-20 
13:42:11.382084921 +0000 UTC m=+1081.059808316" lastFinishedPulling="2026-03-20 13:42:19.955028694 +0000 UTC m=+1089.632752089" observedRunningTime="2026-03-20 13:42:22.388490863 +0000 UTC m=+1092.066214278" watchObservedRunningTime="2026-03-20 13:42:25.292771777 +0000 UTC m=+1094.970495172" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.308165 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-bgrqc"] Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.309577 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.312751 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bgrqc"] Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.314669 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.428663 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.428737 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-ovs-rundir\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.428796 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqm5g\" 
(UniqueName: \"kubernetes.io/projected/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-kube-api-access-jqm5g\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.428933 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-combined-ca-bundle\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.428962 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-config\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.429025 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-ovn-rundir\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.445351 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s4zh7"] Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.445583 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" podUID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerName="dnsmasq-dns" containerID="cri-o://66d46567d825f49b95e07354634d38330f53173a8c0c774896b92ad877f52cee" gracePeriod=10 Mar 20 13:42:25 crc 
kubenswrapper[4849]: I0320 13:42:25.448015 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.477226 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2v75"] Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.478633 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.483758 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.494363 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2v75"] Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.530909 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.531245 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-ovs-rundir\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.531283 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqm5g\" (UniqueName: \"kubernetes.io/projected/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-kube-api-access-jqm5g\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " 
pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.531442 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-combined-ca-bundle\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.531488 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-config\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.531533 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-ovn-rundir\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.531594 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-ovs-rundir\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.531674 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-ovn-rundir\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 
13:42:25.532466 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-config\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.547589 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.547610 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-combined-ca-bundle\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.552326 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqm5g\" (UniqueName: \"kubernetes.io/projected/e5dac0d1-f9a7-4671-8fd5-5030df3fc592-kube-api-access-jqm5g\") pod \"ovn-controller-metrics-bgrqc\" (UID: \"e5dac0d1-f9a7-4671-8fd5-5030df3fc592\") " pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.583236 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qvbj6"] Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.583472 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerName="dnsmasq-dns" 
containerID="cri-o://7c3561beff2f73490be09bc5d21ae2b8aff727724cea1cf79aedaaf66c2c2ca9" gracePeriod=10 Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.605135 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rv5md"] Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.606417 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.612294 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.619120 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rv5md"] Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.633935 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.634009 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.634050 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 
crc kubenswrapper[4849]: I0320 13:42:25.634078 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.634109 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vkfr\" (UniqueName: \"kubernetes.io/projected/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-kube-api-access-8vkfr\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.634135 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7plzk\" (UniqueName: \"kubernetes.io/projected/57669291-1fb9-4564-aa80-25c9cdf20aa0-kube-api-access-7plzk\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.634199 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-config\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.634356 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-config\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.634418 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.641383 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bgrqc" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.738143 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.738219 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.738316 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.739100 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.739169 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vkfr\" (UniqueName: \"kubernetes.io/projected/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-kube-api-access-8vkfr\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.739200 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7plzk\" (UniqueName: \"kubernetes.io/projected/57669291-1fb9-4564-aa80-25c9cdf20aa0-kube-api-access-7plzk\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.739273 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-config\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.739306 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-config\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.739346 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.740385 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-config\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.740419 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-config\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.740525 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.740544 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.740619 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: 
\"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.741049 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.741350 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.756621 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7plzk\" (UniqueName: \"kubernetes.io/projected/57669291-1fb9-4564-aa80-25c9cdf20aa0-kube-api-access-7plzk\") pod \"dnsmasq-dns-86db49b7ff-rv5md\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.757635 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vkfr\" (UniqueName: \"kubernetes.io/projected/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-kube-api-access-8vkfr\") pod \"dnsmasq-dns-7fd796d7df-b2v75\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.804675 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:25 crc kubenswrapper[4849]: I0320 13:42:25.976379 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:26 crc kubenswrapper[4849]: I0320 13:42:26.440024 4849 generic.go:334] "Generic (PLEG): container finished" podID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerID="7c3561beff2f73490be09bc5d21ae2b8aff727724cea1cf79aedaaf66c2c2ca9" exitCode=0 Mar 20 13:42:26 crc kubenswrapper[4849]: I0320 13:42:26.440097 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" event={"ID":"2358ecf4-6327-4cc9-bcc5-c822e2215540","Type":"ContainerDied","Data":"7c3561beff2f73490be09bc5d21ae2b8aff727724cea1cf79aedaaf66c2c2ca9"} Mar 20 13:42:26 crc kubenswrapper[4849]: I0320 13:42:26.442463 4849 generic.go:334] "Generic (PLEG): container finished" podID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerID="66d46567d825f49b95e07354634d38330f53173a8c0c774896b92ad877f52cee" exitCode=0 Mar 20 13:42:26 crc kubenswrapper[4849]: I0320 13:42:26.442494 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" event={"ID":"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d","Type":"ContainerDied","Data":"66d46567d825f49b95e07354634d38330f53173a8c0c774896b92ad877f52cee"} Mar 20 13:42:27 crc kubenswrapper[4849]: W0320 13:42:27.246175 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c5e6b3b_dc09_46d1_aac2_6625c28896fb.slice/crio-2f8f778cf4f923b7c3ec587f84e58ec8f64ab3008b363ada97774941a5000b64 WatchSource:0}: Error finding container 2f8f778cf4f923b7c3ec587f84e58ec8f64ab3008b363ada97774941a5000b64: Status 404 returned error can't find the container with id 2f8f778cf4f923b7c3ec587f84e58ec8f64ab3008b363ada97774941a5000b64 Mar 20 13:42:27 crc kubenswrapper[4849]: I0320 13:42:27.450796 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"9c5e6b3b-dc09-46d1-aac2-6625c28896fb","Type":"ContainerStarted","Data":"2f8f778cf4f923b7c3ec587f84e58ec8f64ab3008b363ada97774941a5000b64"} Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.016679 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.022947 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.077545 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-config\") pod \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.077617 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-dns-svc\") pod \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.077671 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvwt\" (UniqueName: \"kubernetes.io/projected/2358ecf4-6327-4cc9-bcc5-c822e2215540-kube-api-access-ngvwt\") pod \"2358ecf4-6327-4cc9-bcc5-c822e2215540\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.077699 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-dns-svc\") pod \"2358ecf4-6327-4cc9-bcc5-c822e2215540\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.077752 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-config\") pod \"2358ecf4-6327-4cc9-bcc5-c822e2215540\" (UID: \"2358ecf4-6327-4cc9-bcc5-c822e2215540\") " Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.077881 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2mzk\" (UniqueName: \"kubernetes.io/projected/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-kube-api-access-v2mzk\") pod \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\" (UID: \"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d\") " Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.082506 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-kube-api-access-v2mzk" (OuterVolumeSpecName: "kube-api-access-v2mzk") pod "38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" (UID: "38ee0a5e-12f3-41c4-9c82-effb3c4cde0d"). InnerVolumeSpecName "kube-api-access-v2mzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.096058 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2358ecf4-6327-4cc9-bcc5-c822e2215540-kube-api-access-ngvwt" (OuterVolumeSpecName: "kube-api-access-ngvwt") pod "2358ecf4-6327-4cc9-bcc5-c822e2215540" (UID: "2358ecf4-6327-4cc9-bcc5-c822e2215540"). InnerVolumeSpecName "kube-api-access-ngvwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.119945 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-config" (OuterVolumeSpecName: "config") pod "38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" (UID: "38ee0a5e-12f3-41c4-9c82-effb3c4cde0d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.120078 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2358ecf4-6327-4cc9-bcc5-c822e2215540" (UID: "2358ecf4-6327-4cc9-bcc5-c822e2215540"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.123345 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-config" (OuterVolumeSpecName: "config") pod "2358ecf4-6327-4cc9-bcc5-c822e2215540" (UID: "2358ecf4-6327-4cc9-bcc5-c822e2215540"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.140724 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" (UID: "38ee0a5e-12f3-41c4-9c82-effb3c4cde0d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.179551 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.179583 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.179596 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvwt\" (UniqueName: \"kubernetes.io/projected/2358ecf4-6327-4cc9-bcc5-c822e2215540-kube-api-access-ngvwt\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.179609 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.179619 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2358ecf4-6327-4cc9-bcc5-c822e2215540-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.179629 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2mzk\" (UniqueName: \"kubernetes.io/projected/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d-kube-api-access-v2mzk\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.488112 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" event={"ID":"38ee0a5e-12f3-41c4-9c82-effb3c4cde0d","Type":"ContainerDied","Data":"0e3b8da96699957defe3c14768e177465b2e79ea34734a88bbb2a903e1906720"} Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 
13:42:32.488144 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.488169 4849 scope.go:117] "RemoveContainer" containerID="66d46567d825f49b95e07354634d38330f53173a8c0c774896b92ad877f52cee" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.491577 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" event={"ID":"2358ecf4-6327-4cc9-bcc5-c822e2215540","Type":"ContainerDied","Data":"5d6514b3c3b5177224a2b278075f01fcd03f7e19cb6280f672d49dd2d4db61b7"} Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.491664 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.523304 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s4zh7"] Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.533733 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s4zh7"] Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.543987 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qvbj6"] Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.549949 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-qvbj6"] Mar 20 13:42:32 crc kubenswrapper[4849]: I0320 13:42:32.943802 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-qvbj6" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: i/o timeout" Mar 20 13:42:33 crc kubenswrapper[4849]: I0320 13:42:33.038599 4849 scope.go:117] "RemoveContainer" containerID="a187921fc9e6295fb44772d7155a17e1dd78c0c08b5229c24f8089f552f92938" Mar 20 13:42:33 crc 
kubenswrapper[4849]: I0320 13:42:33.045358 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" path="/var/lib/kubelet/pods/2358ecf4-6327-4cc9-bcc5-c822e2215540/volumes" Mar 20 13:42:33 crc kubenswrapper[4849]: I0320 13:42:33.046136 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" path="/var/lib/kubelet/pods/38ee0a5e-12f3-41c4-9c82-effb3c4cde0d/volumes" Mar 20 13:42:33 crc kubenswrapper[4849]: I0320 13:42:33.159721 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-s4zh7" podUID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.99:5353: i/o timeout" Mar 20 13:42:39 crc kubenswrapper[4849]: E0320 13:42:39.254713 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 13:42:39 crc kubenswrapper[4849]: E0320 13:42:39.255384 4849 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:42:39 crc kubenswrapper[4849]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 13:42:39 crc kubenswrapper[4849]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 13:42:39 crc kubenswrapper[4849]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 13:42:39 crc kubenswrapper[4849]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 13:42:39 crc kubenswrapper[4849]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:42:39 crc kubenswrapper[4849]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:42:39 crc kubenswrapper[4849]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:42:39 crc 
kubenswrapper[4849]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 13:42:39 crc kubenswrapper[4849]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7mv6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFr
omSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(464306bd-0d8b-40ca-aa64-1ec5a00a527b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 13:42:39 crc kubenswrapper[4849]: > logger="UnhandledError" Mar 20 13:42:39 crc kubenswrapper[4849]: E0320 13:42:39.256700 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="464306bd-0d8b-40ca-aa64-1ec5a00a527b" Mar 20 13:42:39 crc kubenswrapper[4849]: E0320 13:42:39.423860 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 13:42:39 crc kubenswrapper[4849]: E0320 13:42:39.424435 4849 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:42:39 crc kubenswrapper[4849]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 13:42:39 crc kubenswrapper[4849]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 13:42:39 crc kubenswrapper[4849]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 13:42:39 crc kubenswrapper[4849]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 13:42:39 crc kubenswrapper[4849]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:42:39 crc kubenswrapper[4849]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:42:39 crc kubenswrapper[4849]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:42:39 crc kubenswrapper[4849]: # Allow time for multi-pod clusters 
to complete peer discovery Mar 20 13:42:39 crc kubenswrapper[4849]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxh84,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]
VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(3c3c4952-4c22-4389-834c-969b89fb9e20): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 13:42:39 crc kubenswrapper[4849]: > logger="UnhandledError" Mar 20 13:42:39 crc kubenswrapper[4849]: E0320 13:42:39.425583 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="3c3c4952-4c22-4389-834c-969b89fb9e20" Mar 20 13:42:39 crc kubenswrapper[4849]: I0320 13:42:39.482492 4849 scope.go:117] "RemoveContainer" containerID="21a13c4d8384be90d2e2786b5b8ff8d7c438c4a07a60b64f355f5e37309b8d47" Mar 20 13:42:39 crc kubenswrapper[4849]: E0320 13:42:39.542744 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="3c3c4952-4c22-4389-834c-969b89fb9e20" Mar 20 13:42:39 crc kubenswrapper[4849]: E0320 13:42:39.543385 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-server-0" podUID="464306bd-0d8b-40ca-aa64-1ec5a00a527b" Mar 20 13:42:39 crc kubenswrapper[4849]: I0320 13:42:39.872362 4849 scope.go:117] "RemoveContainer" containerID="7c3561beff2f73490be09bc5d21ae2b8aff727724cea1cf79aedaaf66c2c2ca9" Mar 20 13:42:39 crc kubenswrapper[4849]: I0320 13:42:39.955149 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bgrqc"] Mar 20 13:42:39 crc kubenswrapper[4849]: I0320 13:42:39.990548 4849 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rv5md"] Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.000583 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2v75"] Mar 20 13:42:40 crc kubenswrapper[4849]: W0320 13:42:40.195702 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d6d2c23_0390_4dd6_ac4f_e2840dfcc441.slice/crio-7817aa5255bd37af87b1aeb9d0b8cec52f2b20a35bc634b8ee397671faa4b718 WatchSource:0}: Error finding container 7817aa5255bd37af87b1aeb9d0b8cec52f2b20a35bc634b8ee397671faa4b718: Status 404 returned error can't find the container with id 7817aa5255bd37af87b1aeb9d0b8cec52f2b20a35bc634b8ee397671faa4b718 Mar 20 13:42:40 crc kubenswrapper[4849]: E0320 13:42:40.534559 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 13:42:40 crc kubenswrapper[4849]: E0320 13:42:40.534934 4849 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Mar 20 13:42:40 crc kubenswrapper[4849]: E0320 13:42:40.535079 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hqpqr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(e53df741-614d-449c-8da6-4de0333a6e9b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Mar 20 13:42:40 crc kubenswrapper[4849]: E0320 13:42:40.536274 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="e53df741-614d-449c-8da6-4de0333a6e9b" Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.548746 4849 scope.go:117] "RemoveContainer" containerID="c55d51de1e429f7a6122819d9c859c2241d88ee921f1b150214002d0bb1ad1fc" Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.549805 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" event={"ID":"57669291-1fb9-4564-aa80-25c9cdf20aa0","Type":"ContainerStarted","Data":"8a5c19ed274eae02c06beec4d57726efe0d954b2baacc25ec0db3d12d7b2a3fe"} Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.551112 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bgrqc" event={"ID":"e5dac0d1-f9a7-4671-8fd5-5030df3fc592","Type":"ContainerStarted","Data":"aaa6aad734a48f6e2ea1d718c7d3f7668033b69897ea4a5efda324e5f3a42df8"} Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.553735 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-226bs" event={"ID":"57363bb0-8542-49ea-95b9-84fd9206f644","Type":"ContainerStarted","Data":"1efcdc2bf0d8190f254e84b0aefa815ed546cf46e7bbd740e019ba3e3afff286"} Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.555255 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" event={"ID":"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441","Type":"ContainerStarted","Data":"7817aa5255bd37af87b1aeb9d0b8cec52f2b20a35bc634b8ee397671faa4b718"} Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.557175 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/memcached-0" event={"ID":"200bac0a-008a-4528-bb22-3cf6e1ef6342","Type":"ContainerStarted","Data":"2ab0ee8ce5728dbf6c5d93def3240027ad94a72cf777be207f868f615dd0641a"} Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.557391 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 13:42:40 crc kubenswrapper[4849]: E0320 13:42:40.577608 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="e53df741-614d-449c-8da6-4de0333a6e9b" Mar 20 13:42:40 crc kubenswrapper[4849]: I0320 13:42:40.615903 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.191013004 podStartE2EDuration="29.615883091s" podCreationTimestamp="2026-03-20 13:42:11 +0000 UTC" firstStartedPulling="2026-03-20 13:42:20.49980458 +0000 UTC m=+1090.177527975" lastFinishedPulling="2026-03-20 13:42:31.924674657 +0000 UTC m=+1101.602398062" observedRunningTime="2026-03-20 13:42:40.612062637 +0000 UTC m=+1110.289786062" watchObservedRunningTime="2026-03-20 13:42:40.615883091 +0000 UTC m=+1110.293606486" Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.573209 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f905722-c565-4fe5-bdde-0df02a23b833","Type":"ContainerStarted","Data":"19aeb45be188820fb0fe8151fc10551b9db4b3f0c9534b1205c7236ea44cb83d"} Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.576452 4849 generic.go:334] "Generic (PLEG): container finished" podID="57669291-1fb9-4564-aa80-25c9cdf20aa0" containerID="e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de" exitCode=0 Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.576528 4849 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" event={"ID":"57669291-1fb9-4564-aa80-25c9cdf20aa0","Type":"ContainerDied","Data":"e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de"} Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.582033 4849 generic.go:334] "Generic (PLEG): container finished" podID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" containerID="80a49a9b84ced589e9217f2987bbe1f1a7ff0087f2c8a5a6027f71f06b3becc5" exitCode=0 Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.582239 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" event={"ID":"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441","Type":"ContainerDied","Data":"80a49a9b84ced589e9217f2987bbe1f1a7ff0087f2c8a5a6027f71f06b3becc5"} Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.589759 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9znfj" event={"ID":"f589037a-06aa-452d-82ef-0dbf2177b7fc","Type":"ContainerStarted","Data":"7d44ab363ed89b4208ebde19c091f215c431f4e5a35c129c6d7c804370b4af51"} Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.590049 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9znfj" Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.594235 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"91eeca7c-4c91-4b2f-8541-be7b6a36b582","Type":"ContainerStarted","Data":"1852022fb239c0fc266d12b8cf5f8e98f450a7869bcb752e95841637e9bb7dd4"} Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.602873 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b4ef098b-892c-4619-a5b4-7c10cdf47f9b","Type":"ContainerStarted","Data":"ff48ec613b066c483243a525511a1770218e85a4e19124bbcb3ec69cfe76c816"} Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.616638 4849 generic.go:334] "Generic (PLEG): container finished" 
podID="57363bb0-8542-49ea-95b9-84fd9206f644" containerID="1efcdc2bf0d8190f254e84b0aefa815ed546cf46e7bbd740e019ba3e3afff286" exitCode=0 Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.616700 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-226bs" event={"ID":"57363bb0-8542-49ea-95b9-84fd9206f644","Type":"ContainerDied","Data":"1efcdc2bf0d8190f254e84b0aefa815ed546cf46e7bbd740e019ba3e3afff286"} Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.629585 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9c5e6b3b-dc09-46d1-aac2-6625c28896fb","Type":"ContainerStarted","Data":"36c79a84d2df57674af7eb165079ffbc4eda5b10e0d50fbc05e901591f2d4edd"} Mar 20 13:42:41 crc kubenswrapper[4849]: I0320 13:42:41.678926 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9znfj" podStartSLOduration=5.763384126 podStartE2EDuration="24.678905529s" podCreationTimestamp="2026-03-20 13:42:17 +0000 UTC" firstStartedPulling="2026-03-20 13:42:20.555025099 +0000 UTC m=+1090.232748494" lastFinishedPulling="2026-03-20 13:42:39.470546502 +0000 UTC m=+1109.148269897" observedRunningTime="2026-03-20 13:42:41.671941519 +0000 UTC m=+1111.349664954" watchObservedRunningTime="2026-03-20 13:42:41.678905529 +0000 UTC m=+1111.356628924" Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.690872 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" event={"ID":"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441","Type":"ContainerStarted","Data":"ec71cef01f638708b5847823616aabfe30880f20124f43c3f5edefc82883b97d"} Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.691420 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.694275 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" event={"ID":"57669291-1fb9-4564-aa80-25c9cdf20aa0","Type":"ContainerStarted","Data":"8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05"} Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.695103 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.698759 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-226bs" event={"ID":"57363bb0-8542-49ea-95b9-84fd9206f644","Type":"ContainerStarted","Data":"3d6aab6a4fcde397ad2012c3c4a46a4c9484e8ce82b3c1b0098741e7c68ee17b"} Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.698790 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-226bs" event={"ID":"57363bb0-8542-49ea-95b9-84fd9206f644","Type":"ContainerStarted","Data":"40abec35f5e555260c90a7c2b1aa3069c3a0a7e0c96f4155b66aa18727ae9f69"} Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.698969 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.699078 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.719343 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" podStartSLOduration=17.719325331 podStartE2EDuration="17.719325331s" podCreationTimestamp="2026-03-20 13:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:42.711744264 +0000 UTC m=+1112.389467689" watchObservedRunningTime="2026-03-20 13:42:42.719325331 +0000 UTC m=+1112.397048726" Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.734720 4849 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-226bs" podStartSLOduration=14.475813390999999 podStartE2EDuration="25.734700391s" podCreationTimestamp="2026-03-20 13:42:17 +0000 UTC" firstStartedPulling="2026-03-20 13:42:20.665779326 +0000 UTC m=+1090.343502721" lastFinishedPulling="2026-03-20 13:42:31.924666326 +0000 UTC m=+1101.602389721" observedRunningTime="2026-03-20 13:42:42.734242448 +0000 UTC m=+1112.411965863" watchObservedRunningTime="2026-03-20 13:42:42.734700391 +0000 UTC m=+1112.412423786" Mar 20 13:42:42 crc kubenswrapper[4849]: I0320 13:42:42.751320 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" podStartSLOduration=17.751303465 podStartE2EDuration="17.751303465s" podCreationTimestamp="2026-03-20 13:42:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:42.748967881 +0000 UTC m=+1112.426691296" watchObservedRunningTime="2026-03-20 13:42:42.751303465 +0000 UTC m=+1112.429026860" Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.720359 4849 generic.go:334] "Generic (PLEG): container finished" podID="4f905722-c565-4fe5-bdde-0df02a23b833" containerID="19aeb45be188820fb0fe8151fc10551b9db4b3f0c9534b1205c7236ea44cb83d" exitCode=0 Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.720440 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f905722-c565-4fe5-bdde-0df02a23b833","Type":"ContainerDied","Data":"19aeb45be188820fb0fe8151fc10551b9db4b3f0c9534b1205c7236ea44cb83d"} Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.724514 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bgrqc" 
event={"ID":"e5dac0d1-f9a7-4671-8fd5-5030df3fc592","Type":"ContainerStarted","Data":"0ba259463ac12c6e2fccc4567126a461c7a51b65c3da3b6023902c5f5089f623"} Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.732061 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9c5e6b3b-dc09-46d1-aac2-6625c28896fb","Type":"ContainerStarted","Data":"de256d1d3bf699fce850d83191e78a09ff517218aee7a30818910dc1e6ddb772"} Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.734843 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"91eeca7c-4c91-4b2f-8541-be7b6a36b582","Type":"ContainerStarted","Data":"9d5170f696183854896b924432e17b7a7aae83584d1222ded8b9f8bd15dde1dc"} Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.736326 4849 generic.go:334] "Generic (PLEG): container finished" podID="b4ef098b-892c-4619-a5b4-7c10cdf47f9b" containerID="ff48ec613b066c483243a525511a1770218e85a4e19124bbcb3ec69cfe76c816" exitCode=0 Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.736386 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b4ef098b-892c-4619-a5b4-7c10cdf47f9b","Type":"ContainerDied","Data":"ff48ec613b066c483243a525511a1770218e85a4e19124bbcb3ec69cfe76c816"} Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.817656 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.987651945 podStartE2EDuration="29.817635768s" podCreationTimestamp="2026-03-20 13:42:16 +0000 UTC" firstStartedPulling="2026-03-20 13:42:21.698196899 +0000 UTC m=+1091.375920294" lastFinishedPulling="2026-03-20 13:42:44.528180722 +0000 UTC m=+1114.205904117" observedRunningTime="2026-03-20 13:42:45.80852955 +0000 UTC m=+1115.486252975" watchObservedRunningTime="2026-03-20 13:42:45.817635768 +0000 UTC m=+1115.495359163" Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.832897 4849 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bgrqc" podStartSLOduration=16.458005253 podStartE2EDuration="20.832878465s" podCreationTimestamp="2026-03-20 13:42:25 +0000 UTC" firstStartedPulling="2026-03-20 13:42:40.187460313 +0000 UTC m=+1109.865183708" lastFinishedPulling="2026-03-20 13:42:44.562333525 +0000 UTC m=+1114.240056920" observedRunningTime="2026-03-20 13:42:45.831562619 +0000 UTC m=+1115.509286034" watchObservedRunningTime="2026-03-20 13:42:45.832878465 +0000 UTC m=+1115.510601870" Mar 20 13:42:45 crc kubenswrapper[4849]: I0320 13:42:45.873036 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.533922658 podStartE2EDuration="25.873013172s" podCreationTimestamp="2026-03-20 13:42:20 +0000 UTC" firstStartedPulling="2026-03-20 13:42:27.249956641 +0000 UTC m=+1096.927680076" lastFinishedPulling="2026-03-20 13:42:44.589047195 +0000 UTC m=+1114.266770590" observedRunningTime="2026-03-20 13:42:45.869143396 +0000 UTC m=+1115.546866811" watchObservedRunningTime="2026-03-20 13:42:45.873013172 +0000 UTC m=+1115.550736587" Mar 20 13:42:46 crc kubenswrapper[4849]: I0320 13:42:46.568501 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:46 crc kubenswrapper[4849]: I0320 13:42:46.744410 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b4ef098b-892c-4619-a5b4-7c10cdf47f9b","Type":"ContainerStarted","Data":"494963acbe347e72db4d6dda826179c9aed719358cc436e67e041d3886472a66"} Mar 20 13:42:46 crc kubenswrapper[4849]: I0320 13:42:46.747321 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4f905722-c565-4fe5-bdde-0df02a23b833","Type":"ContainerStarted","Data":"80c243dd384397e3299b53d4cdb163d20fc4a1e667176c9c328169718ea3bdd2"} Mar 20 13:42:46 crc kubenswrapper[4849]: 
I0320 13:42:46.765779 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=18.804071354 podStartE2EDuration="37.765757598s" podCreationTimestamp="2026-03-20 13:42:09 +0000 UTC" firstStartedPulling="2026-03-20 13:42:20.406026548 +0000 UTC m=+1090.083749943" lastFinishedPulling="2026-03-20 13:42:39.367712792 +0000 UTC m=+1109.045436187" observedRunningTime="2026-03-20 13:42:46.763024203 +0000 UTC m=+1116.440747658" watchObservedRunningTime="2026-03-20 13:42:46.765757598 +0000 UTC m=+1116.443481023" Mar 20 13:42:46 crc kubenswrapper[4849]: I0320 13:42:46.789403 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.513432551 podStartE2EDuration="36.789383063s" podCreationTimestamp="2026-03-20 13:42:10 +0000 UTC" firstStartedPulling="2026-03-20 13:42:20.424949765 +0000 UTC m=+1090.102673160" lastFinishedPulling="2026-03-20 13:42:39.700900277 +0000 UTC m=+1109.378623672" observedRunningTime="2026-03-20 13:42:46.783951205 +0000 UTC m=+1116.461674680" watchObservedRunningTime="2026-03-20 13:42:46.789383063 +0000 UTC m=+1116.467106458" Mar 20 13:42:47 crc kubenswrapper[4849]: I0320 13:42:47.138328 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 13:42:48 crc kubenswrapper[4849]: I0320 13:42:48.005316 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:48 crc kubenswrapper[4849]: I0320 13:42:48.005620 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:48 crc kubenswrapper[4849]: I0320 13:42:48.045243 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:48 crc kubenswrapper[4849]: I0320 13:42:48.569117 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:48 crc kubenswrapper[4849]: I0320 13:42:48.616323 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:48 crc kubenswrapper[4849]: I0320 13:42:48.794249 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 13:42:48 crc kubenswrapper[4849]: I0320 13:42:48.804773 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.128607 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:42:49 crc kubenswrapper[4849]: E0320 13:42:49.128965 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerName="dnsmasq-dns" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.128978 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerName="dnsmasq-dns" Mar 20 13:42:49 crc kubenswrapper[4849]: E0320 13:42:49.129017 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerName="dnsmasq-dns" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.129023 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerName="dnsmasq-dns" Mar 20 13:42:49 crc kubenswrapper[4849]: E0320 13:42:49.129035 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerName="init" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.129041 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerName="init" Mar 20 13:42:49 crc kubenswrapper[4849]: E0320 13:42:49.129056 4849 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerName="init" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.129062 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerName="init" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.129195 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ee0a5e-12f3-41c4-9c82-effb3c4cde0d" containerName="dnsmasq-dns" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.129211 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="2358ecf4-6327-4cc9-bcc5-c822e2215540" containerName="dnsmasq-dns" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.129962 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.131730 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.132900 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.132933 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-tssk9" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.133136 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.151606 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.188233 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " 
pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.188278 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fwf2\" (UniqueName: \"kubernetes.io/projected/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-kube-api-access-6fwf2\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.188329 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.188356 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-scripts\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.188379 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-config\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.188395 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.188477 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.289746 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.289815 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fwf2\" (UniqueName: \"kubernetes.io/projected/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-kube-api-access-6fwf2\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.289930 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.289970 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-scripts\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.290005 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-config\") pod 
\"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.290023 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.290038 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.291002 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-scripts\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.291097 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.291243 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-config\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.295650 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.295799 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.311138 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.313777 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fwf2\" (UniqueName: \"kubernetes.io/projected/fad9aa5d-a9a2-40b5-a51b-9ff1f934844f-kube-api-access-6fwf2\") pod \"ovn-northd-0\" (UID: \"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f\") " pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.457843 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:42:49 crc kubenswrapper[4849]: I0320 13:42:49.883861 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:42:49 crc kubenswrapper[4849]: W0320 13:42:49.895186 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfad9aa5d_a9a2_40b5_a51b_9ff1f934844f.slice/crio-a8f82b4a20632717e3e3fceb92855caf868e158526df298e796078ea8a8d2791 WatchSource:0}: Error finding container a8f82b4a20632717e3e3fceb92855caf868e158526df298e796078ea8a8d2791: Status 404 returned error can't find the container with id a8f82b4a20632717e3e3fceb92855caf868e158526df298e796078ea8a8d2791 Mar 20 13:42:50 crc kubenswrapper[4849]: I0320 13:42:50.490682 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 13:42:50 crc kubenswrapper[4849]: I0320 13:42:50.490743 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 13:42:50 crc kubenswrapper[4849]: I0320 13:42:50.773532 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f","Type":"ContainerStarted","Data":"a8f82b4a20632717e3e3fceb92855caf868e158526df298e796078ea8a8d2791"} Mar 20 13:42:50 crc kubenswrapper[4849]: I0320 13:42:50.806981 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:50 crc kubenswrapper[4849]: I0320 13:42:50.978384 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:42:51 crc kubenswrapper[4849]: I0320 13:42:51.094756 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2v75"] Mar 20 13:42:51 crc kubenswrapper[4849]: I0320 13:42:51.545351 4849 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 13:42:51 crc kubenswrapper[4849]: I0320 13:42:51.617233 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="b4ef098b-892c-4619-a5b4-7c10cdf47f9b" containerName="galera" probeResult="failure" output=< Mar 20 13:42:51 crc kubenswrapper[4849]: wsrep_local_state_comment (Joined) differs from Synced Mar 20 13:42:51 crc kubenswrapper[4849]: > Mar 20 13:42:51 crc kubenswrapper[4849]: I0320 13:42:51.779851 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" podUID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" containerName="dnsmasq-dns" containerID="cri-o://ec71cef01f638708b5847823616aabfe30880f20124f43c3f5edefc82883b97d" gracePeriod=10 Mar 20 13:42:51 crc kubenswrapper[4849]: I0320 13:42:51.843926 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:51 crc kubenswrapper[4849]: I0320 13:42:51.843991 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:51 crc kubenswrapper[4849]: I0320 13:42:51.934660 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:52 crc kubenswrapper[4849]: I0320 13:42:52.790958 4849 generic.go:334] "Generic (PLEG): container finished" podID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" containerID="ec71cef01f638708b5847823616aabfe30880f20124f43c3f5edefc82883b97d" exitCode=0 Mar 20 13:42:52 crc kubenswrapper[4849]: I0320 13:42:52.791022 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" event={"ID":"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441","Type":"ContainerDied","Data":"ec71cef01f638708b5847823616aabfe30880f20124f43c3f5edefc82883b97d"} Mar 20 13:42:52 crc kubenswrapper[4849]: I0320 
13:42:52.881295 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.414021 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.564301 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-dns-svc\") pod \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.564388 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-ovsdbserver-nb\") pod \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.564485 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vkfr\" (UniqueName: \"kubernetes.io/projected/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-kube-api-access-8vkfr\") pod \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.564521 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-config\") pod \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\" (UID: \"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441\") " Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.572098 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-kube-api-access-8vkfr" (OuterVolumeSpecName: "kube-api-access-8vkfr") pod 
"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" (UID: "1d6d2c23-0390-4dd6-ac4f-e2840dfcc441"). InnerVolumeSpecName "kube-api-access-8vkfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.667033 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vkfr\" (UniqueName: \"kubernetes.io/projected/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-kube-api-access-8vkfr\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.800575 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f","Type":"ContainerStarted","Data":"a5ce3f33d9b9da5b74374d743266a4ece95b767d8f1d32749c2362cddd3a5fac"} Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.800679 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fad9aa5d-a9a2-40b5-a51b-9ff1f934844f","Type":"ContainerStarted","Data":"3b35fd5c6b70fedd786446f60760cad39d0443f31490cac64483243a1cf6c96c"} Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.800735 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.804467 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e53df741-614d-449c-8da6-4de0333a6e9b","Type":"ContainerStarted","Data":"41a65b89ba16e1797ddad54421f40dcfa1aa9bcc02eb9ea68e86610033152b85"} Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.804720 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.807160 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.807180 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-b2v75" event={"ID":"1d6d2c23-0390-4dd6-ac4f-e2840dfcc441","Type":"ContainerDied","Data":"7817aa5255bd37af87b1aeb9d0b8cec52f2b20a35bc634b8ee397671faa4b718"} Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.807259 4849 scope.go:117] "RemoveContainer" containerID="ec71cef01f638708b5847823616aabfe30880f20124f43c3f5edefc82883b97d" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.813506 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-config" (OuterVolumeSpecName: "config") pod "1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" (UID: "1d6d2c23-0390-4dd6-ac4f-e2840dfcc441"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.824955 4849 scope.go:117] "RemoveContainer" containerID="80a49a9b84ced589e9217f2987bbe1f1a7ff0087f2c8a5a6027f71f06b3becc5" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.831447 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.414305892 podStartE2EDuration="4.831429281s" podCreationTimestamp="2026-03-20 13:42:49 +0000 UTC" firstStartedPulling="2026-03-20 13:42:49.897083677 +0000 UTC m=+1119.574807073" lastFinishedPulling="2026-03-20 13:42:53.314207067 +0000 UTC m=+1122.991930462" observedRunningTime="2026-03-20 13:42:53.829452197 +0000 UTC m=+1123.507175612" watchObservedRunningTime="2026-03-20 13:42:53.831429281 +0000 UTC m=+1123.509152676" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.864388 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=7.927388135 
podStartE2EDuration="40.864360851s" podCreationTimestamp="2026-03-20 13:42:13 +0000 UTC" firstStartedPulling="2026-03-20 13:42:20.554946687 +0000 UTC m=+1090.232670082" lastFinishedPulling="2026-03-20 13:42:53.491919403 +0000 UTC m=+1123.169642798" observedRunningTime="2026-03-20 13:42:53.853222577 +0000 UTC m=+1123.530945992" watchObservedRunningTime="2026-03-20 13:42:53.864360851 +0000 UTC m=+1123.542084286" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.869309 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.907194 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" (UID: "1d6d2c23-0390-4dd6-ac4f-e2840dfcc441"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.916695 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" (UID: "1d6d2c23-0390-4dd6-ac4f-e2840dfcc441"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.971451 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:53 crc kubenswrapper[4849]: I0320 13:42:53.971784 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.149250 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2v75"] Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.158434 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-b2v75"] Mar 20 13:42:54 crc kubenswrapper[4849]: E0320 13:42:54.247361 4849 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d6d2c23_0390_4dd6_ac4f_e2840dfcc441.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.400029 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhdr6"] Mar 20 13:42:54 crc kubenswrapper[4849]: E0320 13:42:54.400582 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" containerName="init" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.400597 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" containerName="init" Mar 20 13:42:54 crc kubenswrapper[4849]: E0320 13:42:54.400644 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" containerName="dnsmasq-dns" Mar 20 13:42:54 
crc kubenswrapper[4849]: I0320 13:42:54.400651 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" containerName="dnsmasq-dns" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.400792 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" containerName="dnsmasq-dns" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.402775 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.422894 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhdr6"] Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.479487 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-config\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.479534 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.479573 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.479687 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-dns-svc\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.479796 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2sr\" (UniqueName: \"kubernetes.io/projected/959c8c00-cbce-41a1-8b5b-85d5885bda82-kube-api-access-rp2sr\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.581747 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-config\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.581797 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.581856 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.581887 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-dns-svc\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.581916 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2sr\" (UniqueName: \"kubernetes.io/projected/959c8c00-cbce-41a1-8b5b-85d5885bda82-kube-api-access-rp2sr\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.583073 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.583188 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.583198 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-dns-svc\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.583706 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-config\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.606143 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2sr\" (UniqueName: \"kubernetes.io/projected/959c8c00-cbce-41a1-8b5b-85d5885bda82-kube-api-access-rp2sr\") pod \"dnsmasq-dns-698758b865-qhdr6\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.724297 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.822031 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"464306bd-0d8b-40ca-aa64-1ec5a00a527b","Type":"ContainerStarted","Data":"b1dd5e5fd8b797ef0e76485804a0026a2df6033f7edc7ea0a4bb051ae008a98a"} Mar 20 13:42:54 crc kubenswrapper[4849]: I0320 13:42:54.827894 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c3c4952-4c22-4389-834c-969b89fb9e20","Type":"ContainerStarted","Data":"5156b08be662746166584ffb769bec7d11452c92898f57b2a574d5ca7c44f253"} Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.049696 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6d2c23-0390-4dd6-ac4f-e2840dfcc441" path="/var/lib/kubelet/pods/1d6d2c23-0390-4dd6-ac4f-e2840dfcc441/volumes" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.198144 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhdr6"] Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.526276 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:42:55 crc 
kubenswrapper[4849]: I0320 13:42:55.541916 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.542175 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.544789 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.544866 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.544869 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.545054 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-sx89g" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.596724 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.596791 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/189da4ab-90d9-4761-b94e-77f30a025385-cache\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.596858 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod \"swift-storage-0\" (UID: 
\"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.596885 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189da4ab-90d9-4761-b94e-77f30a025385-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.597110 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/189da4ab-90d9-4761-b94e-77f30a025385-lock\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.597139 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq557\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-kube-api-access-bq557\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.699045 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.699116 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/189da4ab-90d9-4761-b94e-77f30a025385-cache\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.699156 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.699174 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189da4ab-90d9-4761-b94e-77f30a025385-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.699233 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/189da4ab-90d9-4761-b94e-77f30a025385-lock\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.699254 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq557\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-kube-api-access-bq557\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.699326 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: E0320 13:42:55.699340 4849 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:42:55 crc kubenswrapper[4849]: E0320 13:42:55.699373 4849 projected.go:194] Error 
preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:42:55 crc kubenswrapper[4849]: E0320 13:42:55.699442 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift podName:189da4ab-90d9-4761-b94e-77f30a025385 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:56.199420428 +0000 UTC m=+1125.877143893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift") pod "swift-storage-0" (UID: "189da4ab-90d9-4761-b94e-77f30a025385") : configmap "swift-ring-files" not found Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.699689 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/189da4ab-90d9-4761-b94e-77f30a025385-cache\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.700310 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/189da4ab-90d9-4761-b94e-77f30a025385-lock\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.703967 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/189da4ab-90d9-4761-b94e-77f30a025385-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.717768 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq557\" (UniqueName: 
\"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-kube-api-access-bq557\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.719497 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.838421 4849 generic.go:334] "Generic (PLEG): container finished" podID="959c8c00-cbce-41a1-8b5b-85d5885bda82" containerID="311477fbb4048d73b18530d45a84270d4c482e1d4f9ffd34ef9d17fdc4323edf" exitCode=0 Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.838471 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhdr6" event={"ID":"959c8c00-cbce-41a1-8b5b-85d5885bda82","Type":"ContainerDied","Data":"311477fbb4048d73b18530d45a84270d4c482e1d4f9ffd34ef9d17fdc4323edf"} Mar 20 13:42:55 crc kubenswrapper[4849]: I0320 13:42:55.838503 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhdr6" event={"ID":"959c8c00-cbce-41a1-8b5b-85d5885bda82","Type":"ContainerStarted","Data":"e78d914b40c9d4b6bae88daa9dc8fff2a1b517107b09183856b86c6b919c9c1c"} Mar 20 13:42:56 crc kubenswrapper[4849]: I0320 13:42:56.207340 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:56 crc kubenswrapper[4849]: E0320 13:42:56.207560 4849 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:42:56 crc kubenswrapper[4849]: E0320 13:42:56.207600 
4849 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:42:56 crc kubenswrapper[4849]: E0320 13:42:56.207663 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift podName:189da4ab-90d9-4761-b94e-77f30a025385 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:57.207646905 +0000 UTC m=+1126.885370290 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift") pod "swift-storage-0" (UID: "189da4ab-90d9-4761-b94e-77f30a025385") : configmap "swift-ring-files" not found Mar 20 13:42:56 crc kubenswrapper[4849]: I0320 13:42:56.859451 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhdr6" event={"ID":"959c8c00-cbce-41a1-8b5b-85d5885bda82","Type":"ContainerStarted","Data":"6491f273c292616d8b86ff0b8c53113303e592f9a04760b6e490dca0bb172dd7"} Mar 20 13:42:56 crc kubenswrapper[4849]: I0320 13:42:56.859599 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:42:56 crc kubenswrapper[4849]: I0320 13:42:56.877324 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-qhdr6" podStartSLOduration=2.877300045 podStartE2EDuration="2.877300045s" podCreationTimestamp="2026-03-20 13:42:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:56.875351182 +0000 UTC m=+1126.553074597" watchObservedRunningTime="2026-03-20 13:42:56.877300045 +0000 UTC m=+1126.555023460" Mar 20 13:42:57 crc kubenswrapper[4849]: I0320 13:42:57.227595 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:57 crc kubenswrapper[4849]: E0320 13:42:57.227860 4849 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:42:57 crc kubenswrapper[4849]: E0320 13:42:57.227903 4849 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:42:57 crc kubenswrapper[4849]: E0320 13:42:57.227983 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift podName:189da4ab-90d9-4761-b94e-77f30a025385 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:59.227958578 +0000 UTC m=+1128.905681983 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift") pod "swift-storage-0" (UID: "189da4ab-90d9-4761-b94e-77f30a025385") : configmap "swift-ring-files" not found Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.264792 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:42:59 crc kubenswrapper[4849]: E0320 13:42:59.265106 4849 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:42:59 crc kubenswrapper[4849]: E0320 13:42:59.265322 4849 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:42:59 crc kubenswrapper[4849]: E0320 
13:42:59.265429 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift podName:189da4ab-90d9-4761-b94e-77f30a025385 nodeName:}" failed. No retries permitted until 2026-03-20 13:43:03.265392575 +0000 UTC m=+1132.943115970 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift") pod "swift-storage-0" (UID: "189da4ab-90d9-4761-b94e-77f30a025385") : configmap "swift-ring-files" not found Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.487333 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-pmrvl"] Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.488348 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.490388 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.490845 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.491142 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.495710 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pmrvl"] Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.569385 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-scripts\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc 
kubenswrapper[4849]: I0320 13:42:59.569666 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-ring-data-devices\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.569752 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-dispersionconf\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.569834 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfhkv\" (UniqueName: \"kubernetes.io/projected/07ad4563-bfe9-462b-8191-f21c950281df-kube-api-access-dfhkv\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.569857 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-swiftconf\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.569873 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-combined-ca-bundle\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " 
pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.569959 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ad4563-bfe9-462b-8191-f21c950281df-etc-swift\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.671668 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ad4563-bfe9-462b-8191-f21c950281df-etc-swift\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.671750 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-scripts\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.671888 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-ring-data-devices\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.671941 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-dispersionconf\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: 
I0320 13:42:59.672171 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfhkv\" (UniqueName: \"kubernetes.io/projected/07ad4563-bfe9-462b-8191-f21c950281df-kube-api-access-dfhkv\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.672207 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-swiftconf\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.672235 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-combined-ca-bundle\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.672440 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ad4563-bfe9-462b-8191-f21c950281df-etc-swift\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.672892 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-ring-data-devices\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.673053 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-scripts\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.677999 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-dispersionconf\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.678040 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-swiftconf\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.679482 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-combined-ca-bundle\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.695408 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfhkv\" (UniqueName: \"kubernetes.io/projected/07ad4563-bfe9-462b-8191-f21c950281df-kube-api-access-dfhkv\") pod \"swift-ring-rebalance-pmrvl\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:42:59 crc kubenswrapper[4849]: I0320 13:42:59.807864 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:43:00 crc kubenswrapper[4849]: W0320 13:43:00.232530 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ad4563_bfe9_462b_8191_f21c950281df.slice/crio-a92503785b2bfc91fcf1f7cd7efa309483058c893160c2d0544d539dc2b8ec3a WatchSource:0}: Error finding container a92503785b2bfc91fcf1f7cd7efa309483058c893160c2d0544d539dc2b8ec3a: Status 404 returned error can't find the container with id a92503785b2bfc91fcf1f7cd7efa309483058c893160c2d0544d539dc2b8ec3a Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.239275 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-pmrvl"] Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.558595 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tdfz5"] Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.562519 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.569225 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tdfz5"] Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.569459 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.603159 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.691482 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-operator-scripts\") pod \"root-account-create-update-tdfz5\" (UID: \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\") " pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.691612 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftkdl\" (UniqueName: \"kubernetes.io/projected/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-kube-api-access-ftkdl\") pod \"root-account-create-update-tdfz5\" (UID: \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\") " pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.794332 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-operator-scripts\") pod \"root-account-create-update-tdfz5\" (UID: \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\") " pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.794383 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ftkdl\" (UniqueName: \"kubernetes.io/projected/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-kube-api-access-ftkdl\") pod \"root-account-create-update-tdfz5\" (UID: \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\") " pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.797268 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-operator-scripts\") pod \"root-account-create-update-tdfz5\" (UID: \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\") " pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.813915 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftkdl\" (UniqueName: \"kubernetes.io/projected/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-kube-api-access-ftkdl\") pod \"root-account-create-update-tdfz5\" (UID: \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\") " pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.881609 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:00 crc kubenswrapper[4849]: I0320 13:43:00.895127 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pmrvl" event={"ID":"07ad4563-bfe9-462b-8191-f21c950281df","Type":"ContainerStarted","Data":"a92503785b2bfc91fcf1f7cd7efa309483058c893160c2d0544d539dc2b8ec3a"} Mar 20 13:43:01 crc kubenswrapper[4849]: I0320 13:43:01.432945 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tdfz5"] Mar 20 13:43:01 crc kubenswrapper[4849]: I0320 13:43:01.908699 4849 generic.go:334] "Generic (PLEG): container finished" podID="a8ad0333-bcd8-4886-8d77-6741e82b8f3b" containerID="e9c29193f3b4ae7ba05fc46ceb0ce59a15cc3082f4b41e5226357a6f7dbc60a9" exitCode=0 Mar 20 13:43:01 crc kubenswrapper[4849]: I0320 13:43:01.908848 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tdfz5" event={"ID":"a8ad0333-bcd8-4886-8d77-6741e82b8f3b","Type":"ContainerDied","Data":"e9c29193f3b4ae7ba05fc46ceb0ce59a15cc3082f4b41e5226357a6f7dbc60a9"} Mar 20 13:43:01 crc kubenswrapper[4849]: I0320 13:43:01.909160 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tdfz5" event={"ID":"a8ad0333-bcd8-4886-8d77-6741e82b8f3b","Type":"ContainerStarted","Data":"847f544fbc8e978ed1d1f3dea1b9a425314fe830f42365fbb96944eba36ad4e4"} Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.193740 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qft7p"] Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.195235 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.200930 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qft7p"] Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.301095 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0a99-account-create-update-8k85v"] Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.302164 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.304043 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.310586 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a99-account-create-update-8k85v"] Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.329724 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrlp2\" (UniqueName: \"kubernetes.io/projected/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-kube-api-access-vrlp2\") pod \"keystone-db-create-qft7p\" (UID: \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\") " pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.329806 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.329850 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-operator-scripts\") pod 
\"keystone-db-create-qft7p\" (UID: \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\") " pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:03 crc kubenswrapper[4849]: E0320 13:43:03.329996 4849 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:43:03 crc kubenswrapper[4849]: E0320 13:43:03.330009 4849 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:43:03 crc kubenswrapper[4849]: E0320 13:43:03.330041 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift podName:189da4ab-90d9-4761-b94e-77f30a025385 nodeName:}" failed. No retries permitted until 2026-03-20 13:43:11.330029158 +0000 UTC m=+1141.007752553 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift") pod "swift-storage-0" (UID: "189da4ab-90d9-4761-b94e-77f30a025385") : configmap "swift-ring-files" not found Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.398581 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-pjvhk"] Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.399803 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.406793 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pjvhk"] Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.431327 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-operator-scripts\") pod \"keystone-db-create-qft7p\" (UID: \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\") " pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.431432 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4t7m\" (UniqueName: \"kubernetes.io/projected/481f5fb9-0040-4372-96b7-15e549dab23a-kube-api-access-x4t7m\") pod \"keystone-0a99-account-create-update-8k85v\" (UID: \"481f5fb9-0040-4372-96b7-15e549dab23a\") " pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.431490 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrlp2\" (UniqueName: \"kubernetes.io/projected/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-kube-api-access-vrlp2\") pod \"keystone-db-create-qft7p\" (UID: \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\") " pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.431540 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481f5fb9-0040-4372-96b7-15e549dab23a-operator-scripts\") pod \"keystone-0a99-account-create-update-8k85v\" (UID: \"481f5fb9-0040-4372-96b7-15e549dab23a\") " pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.432374 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-operator-scripts\") pod \"keystone-db-create-qft7p\" (UID: \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\") " pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.452719 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrlp2\" (UniqueName: \"kubernetes.io/projected/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-kube-api-access-vrlp2\") pod \"keystone-db-create-qft7p\" (UID: \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\") " pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.502564 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3d75-account-create-update-6bd9j"] Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.504528 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.507733 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.513455 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.514060 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3d75-account-create-update-6bd9j"] Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.533343 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7xj6\" (UniqueName: \"kubernetes.io/projected/81b9c6f1-f1c5-4310-9c95-649b730470a5-kube-api-access-g7xj6\") pod \"placement-db-create-pjvhk\" (UID: \"81b9c6f1-f1c5-4310-9c95-649b730470a5\") " pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.533426 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b9c6f1-f1c5-4310-9c95-649b730470a5-operator-scripts\") pod \"placement-db-create-pjvhk\" (UID: \"81b9c6f1-f1c5-4310-9c95-649b730470a5\") " pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.533495 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4t7m\" (UniqueName: \"kubernetes.io/projected/481f5fb9-0040-4372-96b7-15e549dab23a-kube-api-access-x4t7m\") pod \"keystone-0a99-account-create-update-8k85v\" (UID: \"481f5fb9-0040-4372-96b7-15e549dab23a\") " pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.533616 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481f5fb9-0040-4372-96b7-15e549dab23a-operator-scripts\") pod \"keystone-0a99-account-create-update-8k85v\" (UID: \"481f5fb9-0040-4372-96b7-15e549dab23a\") " pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.536130 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481f5fb9-0040-4372-96b7-15e549dab23a-operator-scripts\") pod \"keystone-0a99-account-create-update-8k85v\" (UID: \"481f5fb9-0040-4372-96b7-15e549dab23a\") " pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.548742 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4t7m\" (UniqueName: \"kubernetes.io/projected/481f5fb9-0040-4372-96b7-15e549dab23a-kube-api-access-x4t7m\") pod \"keystone-0a99-account-create-update-8k85v\" (UID: \"481f5fb9-0040-4372-96b7-15e549dab23a\") " pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.617869 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.635045 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b9c6f1-f1c5-4310-9c95-649b730470a5-operator-scripts\") pod \"placement-db-create-pjvhk\" (UID: \"81b9c6f1-f1c5-4310-9c95-649b730470a5\") " pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.635157 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gdq8\" (UniqueName: \"kubernetes.io/projected/0b28b324-7675-41b1-b1af-e37801c55af0-kube-api-access-2gdq8\") pod \"placement-3d75-account-create-update-6bd9j\" (UID: \"0b28b324-7675-41b1-b1af-e37801c55af0\") " pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.635182 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0b28b324-7675-41b1-b1af-e37801c55af0-operator-scripts\") pod \"placement-3d75-account-create-update-6bd9j\" (UID: \"0b28b324-7675-41b1-b1af-e37801c55af0\") " pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.635246 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7xj6\" (UniqueName: \"kubernetes.io/projected/81b9c6f1-f1c5-4310-9c95-649b730470a5-kube-api-access-g7xj6\") pod \"placement-db-create-pjvhk\" (UID: \"81b9c6f1-f1c5-4310-9c95-649b730470a5\") " pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.636104 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b9c6f1-f1c5-4310-9c95-649b730470a5-operator-scripts\") pod \"placement-db-create-pjvhk\" (UID: \"81b9c6f1-f1c5-4310-9c95-649b730470a5\") " pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.652001 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7xj6\" (UniqueName: \"kubernetes.io/projected/81b9c6f1-f1c5-4310-9c95-649b730470a5-kube-api-access-g7xj6\") pod \"placement-db-create-pjvhk\" (UID: \"81b9c6f1-f1c5-4310-9c95-649b730470a5\") " pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.714491 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.736540 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gdq8\" (UniqueName: \"kubernetes.io/projected/0b28b324-7675-41b1-b1af-e37801c55af0-kube-api-access-2gdq8\") pod \"placement-3d75-account-create-update-6bd9j\" (UID: \"0b28b324-7675-41b1-b1af-e37801c55af0\") " pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.736596 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b28b324-7675-41b1-b1af-e37801c55af0-operator-scripts\") pod \"placement-3d75-account-create-update-6bd9j\" (UID: \"0b28b324-7675-41b1-b1af-e37801c55af0\") " pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.737332 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b28b324-7675-41b1-b1af-e37801c55af0-operator-scripts\") pod \"placement-3d75-account-create-update-6bd9j\" (UID: \"0b28b324-7675-41b1-b1af-e37801c55af0\") " pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.760398 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gdq8\" (UniqueName: \"kubernetes.io/projected/0b28b324-7675-41b1-b1af-e37801c55af0-kube-api-access-2gdq8\") pod \"placement-3d75-account-create-update-6bd9j\" (UID: \"0b28b324-7675-41b1-b1af-e37801c55af0\") " pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:03 crc kubenswrapper[4849]: I0320 13:43:03.826642 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.330096 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.706643 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.728960 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.762392 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-operator-scripts\") pod \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\" (UID: \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\") " Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.762567 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftkdl\" (UniqueName: \"kubernetes.io/projected/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-kube-api-access-ftkdl\") pod \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\" (UID: \"a8ad0333-bcd8-4886-8d77-6741e82b8f3b\") " Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.776399 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8ad0333-bcd8-4886-8d77-6741e82b8f3b" (UID: "a8ad0333-bcd8-4886-8d77-6741e82b8f3b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.781197 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-kube-api-access-ftkdl" (OuterVolumeSpecName: "kube-api-access-ftkdl") pod "a8ad0333-bcd8-4886-8d77-6741e82b8f3b" (UID: "a8ad0333-bcd8-4886-8d77-6741e82b8f3b"). InnerVolumeSpecName "kube-api-access-ftkdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.824988 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rv5md"] Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.825255 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" podUID="57669291-1fb9-4564-aa80-25c9cdf20aa0" containerName="dnsmasq-dns" containerID="cri-o://8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05" gracePeriod=10 Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.865059 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftkdl\" (UniqueName: \"kubernetes.io/projected/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-kube-api-access-ftkdl\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.866049 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8ad0333-bcd8-4886-8d77-6741e82b8f3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.944401 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tdfz5" event={"ID":"a8ad0333-bcd8-4886-8d77-6741e82b8f3b","Type":"ContainerDied","Data":"847f544fbc8e978ed1d1f3dea1b9a425314fe830f42365fbb96944eba36ad4e4"} Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.944443 4849 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tdfz5" Mar 20 13:43:04 crc kubenswrapper[4849]: I0320 13:43:04.944445 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="847f544fbc8e978ed1d1f3dea1b9a425314fe830f42365fbb96944eba36ad4e4" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.122387 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-pjvhk"] Mar 20 13:43:05 crc kubenswrapper[4849]: W0320 13:43:05.126002 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81b9c6f1_f1c5_4310_9c95_649b730470a5.slice/crio-467c284c60ce03eb903272bc6f0cc4104ba691810509606207012853ff0dd37a WatchSource:0}: Error finding container 467c284c60ce03eb903272bc6f0cc4104ba691810509606207012853ff0dd37a: Status 404 returned error can't find the container with id 467c284c60ce03eb903272bc6f0cc4104ba691810509606207012853ff0dd37a Mar 20 13:43:05 crc kubenswrapper[4849]: W0320 13:43:05.128732 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b28b324_7675_41b1_b1af_e37801c55af0.slice/crio-42953762be9117886484445ecdae4fc8349c4055d7b6a7e8d221651e3c3a9e5d WatchSource:0}: Error finding container 42953762be9117886484445ecdae4fc8349c4055d7b6a7e8d221651e3c3a9e5d: Status 404 returned error can't find the container with id 42953762be9117886484445ecdae4fc8349c4055d7b6a7e8d221651e3c3a9e5d Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.135495 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3d75-account-create-update-6bd9j"] Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.170893 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0a99-account-create-update-8k85v"] Mar 20 13:43:05 crc kubenswrapper[4849]: W0320 13:43:05.188618 4849 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481f5fb9_0040_4372_96b7_15e549dab23a.slice/crio-35d587dbe4a097b6c68a359a3e51584e8f40bbe7f8fce34c71e18c6e478a8ba8 WatchSource:0}: Error finding container 35d587dbe4a097b6c68a359a3e51584e8f40bbe7f8fce34c71e18c6e478a8ba8: Status 404 returned error can't find the container with id 35d587dbe4a097b6c68a359a3e51584e8f40bbe7f8fce34c71e18c6e478a8ba8 Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.318922 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qft7p"] Mar 20 13:43:05 crc kubenswrapper[4849]: W0320 13:43:05.338355 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0fd8a46_86a6_403d_b740_ddd048bdc4b0.slice/crio-14fbcaac1f8c7bac65204cdec8e4baf472071798524509b652e79022626c0a71 WatchSource:0}: Error finding container 14fbcaac1f8c7bac65204cdec8e4baf472071798524509b652e79022626c0a71: Status 404 returned error can't find the container with id 14fbcaac1f8c7bac65204cdec8e4baf472071798524509b652e79022626c0a71 Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.820412 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.890097 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7plzk\" (UniqueName: \"kubernetes.io/projected/57669291-1fb9-4564-aa80-25c9cdf20aa0-kube-api-access-7plzk\") pod \"57669291-1fb9-4564-aa80-25c9cdf20aa0\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.890411 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-dns-svc\") pod \"57669291-1fb9-4564-aa80-25c9cdf20aa0\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.890502 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-config\") pod \"57669291-1fb9-4564-aa80-25c9cdf20aa0\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.890629 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-nb\") pod \"57669291-1fb9-4564-aa80-25c9cdf20aa0\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.890703 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-sb\") pod \"57669291-1fb9-4564-aa80-25c9cdf20aa0\" (UID: \"57669291-1fb9-4564-aa80-25c9cdf20aa0\") " Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.899072 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/57669291-1fb9-4564-aa80-25c9cdf20aa0-kube-api-access-7plzk" (OuterVolumeSpecName: "kube-api-access-7plzk") pod "57669291-1fb9-4564-aa80-25c9cdf20aa0" (UID: "57669291-1fb9-4564-aa80-25c9cdf20aa0"). InnerVolumeSpecName "kube-api-access-7plzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.931396 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "57669291-1fb9-4564-aa80-25c9cdf20aa0" (UID: "57669291-1fb9-4564-aa80-25c9cdf20aa0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.932063 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-config" (OuterVolumeSpecName: "config") pod "57669291-1fb9-4564-aa80-25c9cdf20aa0" (UID: "57669291-1fb9-4564-aa80-25c9cdf20aa0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.933746 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57669291-1fb9-4564-aa80-25c9cdf20aa0" (UID: "57669291-1fb9-4564-aa80-25c9cdf20aa0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.947253 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57669291-1fb9-4564-aa80-25c9cdf20aa0" (UID: "57669291-1fb9-4564-aa80-25c9cdf20aa0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.955639 4849 generic.go:334] "Generic (PLEG): container finished" podID="81b9c6f1-f1c5-4310-9c95-649b730470a5" containerID="fb4ffd092b417824dce96e2e08c24ef2c022bf8bb0d70888a05443d8404cdfe1" exitCode=0 Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.955747 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pjvhk" event={"ID":"81b9c6f1-f1c5-4310-9c95-649b730470a5","Type":"ContainerDied","Data":"fb4ffd092b417824dce96e2e08c24ef2c022bf8bb0d70888a05443d8404cdfe1"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.956038 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pjvhk" event={"ID":"81b9c6f1-f1c5-4310-9c95-649b730470a5","Type":"ContainerStarted","Data":"467c284c60ce03eb903272bc6f0cc4104ba691810509606207012853ff0dd37a"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.958474 4849 generic.go:334] "Generic (PLEG): container finished" podID="57669291-1fb9-4564-aa80-25c9cdf20aa0" containerID="8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05" exitCode=0 Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.958544 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.958583 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" event={"ID":"57669291-1fb9-4564-aa80-25c9cdf20aa0","Type":"ContainerDied","Data":"8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.958617 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rv5md" event={"ID":"57669291-1fb9-4564-aa80-25c9cdf20aa0","Type":"ContainerDied","Data":"8a5c19ed274eae02c06beec4d57726efe0d954b2baacc25ec0db3d12d7b2a3fe"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.958636 4849 scope.go:117] "RemoveContainer" containerID="8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.971876 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3d75-account-create-update-6bd9j" event={"ID":"0b28b324-7675-41b1-b1af-e37801c55af0","Type":"ContainerStarted","Data":"47c60bc3417b34de6c722ab64bc3d05ee39f4c0262b7596e940a2d97a48a9026"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.971943 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3d75-account-create-update-6bd9j" event={"ID":"0b28b324-7675-41b1-b1af-e37801c55af0","Type":"ContainerStarted","Data":"42953762be9117886484445ecdae4fc8349c4055d7b6a7e8d221651e3c3a9e5d"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.974091 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qft7p" event={"ID":"c0fd8a46-86a6-403d-b740-ddd048bdc4b0","Type":"ContainerStarted","Data":"76061c666572f1825eb1147c473b1cd80697afb3b14af268eb9aa157da8de120"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.974135 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-qft7p" event={"ID":"c0fd8a46-86a6-403d-b740-ddd048bdc4b0","Type":"ContainerStarted","Data":"14fbcaac1f8c7bac65204cdec8e4baf472071798524509b652e79022626c0a71"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.976366 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a99-account-create-update-8k85v" event={"ID":"481f5fb9-0040-4372-96b7-15e549dab23a","Type":"ContainerStarted","Data":"140f4f346af43508331dc6f7f986eb57d97ee9144909d9e6a822159124442cfb"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.976414 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a99-account-create-update-8k85v" event={"ID":"481f5fb9-0040-4372-96b7-15e549dab23a","Type":"ContainerStarted","Data":"35d587dbe4a097b6c68a359a3e51584e8f40bbe7f8fce34c71e18c6e478a8ba8"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.979409 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pmrvl" event={"ID":"07ad4563-bfe9-462b-8191-f21c950281df","Type":"ContainerStarted","Data":"de6ab5f36fe63dfdd36651341060812b8837886687aea327c5a6b05ff7de0c6c"} Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.992778 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7plzk\" (UniqueName: \"kubernetes.io/projected/57669291-1fb9-4564-aa80-25c9cdf20aa0-kube-api-access-7plzk\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.992810 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.992833 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 
13:43:05.992842 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:05 crc kubenswrapper[4849]: I0320 13:43:05.992853 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57669291-1fb9-4564-aa80-25c9cdf20aa0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.008346 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-qft7p" podStartSLOduration=3.008331159 podStartE2EDuration="3.008331159s" podCreationTimestamp="2026-03-20 13:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:05.988763104 +0000 UTC m=+1135.666486519" watchObservedRunningTime="2026-03-20 13:43:06.008331159 +0000 UTC m=+1135.686054554" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.013068 4849 scope.go:117] "RemoveContainer" containerID="e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.020688 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3d75-account-create-update-6bd9j" podStartSLOduration=3.020667946 podStartE2EDuration="3.020667946s" podCreationTimestamp="2026-03-20 13:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:06.001592254 +0000 UTC m=+1135.679315649" watchObservedRunningTime="2026-03-20 13:43:06.020667946 +0000 UTC m=+1135.698391341" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.043756 4849 scope.go:117] "RemoveContainer" containerID="8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05" 
Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.044478 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-pmrvl" podStartSLOduration=2.3067115080000002 podStartE2EDuration="7.044460686s" podCreationTimestamp="2026-03-20 13:42:59 +0000 UTC" firstStartedPulling="2026-03-20 13:43:00.235199576 +0000 UTC m=+1129.912922981" lastFinishedPulling="2026-03-20 13:43:04.972948774 +0000 UTC m=+1134.650672159" observedRunningTime="2026-03-20 13:43:06.025785436 +0000 UTC m=+1135.703508851" watchObservedRunningTime="2026-03-20 13:43:06.044460686 +0000 UTC m=+1135.722184081" Mar 20 13:43:06 crc kubenswrapper[4849]: E0320 13:43:06.050284 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05\": container with ID starting with 8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05 not found: ID does not exist" containerID="8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.050373 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05"} err="failed to get container status \"8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05\": rpc error: code = NotFound desc = could not find container \"8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05\": container with ID starting with 8cee9484aee1886b41de39170579614e4bf3047b43be6376752d674d829e2e05 not found: ID does not exist" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.050400 4849 scope.go:117] "RemoveContainer" containerID="e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de" Mar 20 13:43:06 crc kubenswrapper[4849]: E0320 13:43:06.050932 4849 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de\": container with ID starting with e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de not found: ID does not exist" containerID="e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.050974 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de"} err="failed to get container status \"e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de\": rpc error: code = NotFound desc = could not find container \"e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de\": container with ID starting with e8e7e81cf0f7cd3263c96836b438d5673e5bf8085de77bb3587a5e1f7be862de not found: ID does not exist" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.052076 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rv5md"] Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.059861 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rv5md"] Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.061702 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-0a99-account-create-update-8k85v" podStartSLOduration=3.061692367 podStartE2EDuration="3.061692367s" podCreationTimestamp="2026-03-20 13:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:06.053842312 +0000 UTC m=+1135.731565727" watchObservedRunningTime="2026-03-20 13:43:06.061692367 +0000 UTC m=+1135.739415762" Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.997868 4849 generic.go:334] "Generic (PLEG): container finished" 
podID="0b28b324-7675-41b1-b1af-e37801c55af0" containerID="47c60bc3417b34de6c722ab64bc3d05ee39f4c0262b7596e940a2d97a48a9026" exitCode=0 Mar 20 13:43:06 crc kubenswrapper[4849]: I0320 13:43:06.998228 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3d75-account-create-update-6bd9j" event={"ID":"0b28b324-7675-41b1-b1af-e37801c55af0","Type":"ContainerDied","Data":"47c60bc3417b34de6c722ab64bc3d05ee39f4c0262b7596e940a2d97a48a9026"} Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.000363 4849 generic.go:334] "Generic (PLEG): container finished" podID="c0fd8a46-86a6-403d-b740-ddd048bdc4b0" containerID="76061c666572f1825eb1147c473b1cd80697afb3b14af268eb9aa157da8de120" exitCode=0 Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.000412 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qft7p" event={"ID":"c0fd8a46-86a6-403d-b740-ddd048bdc4b0","Type":"ContainerDied","Data":"76061c666572f1825eb1147c473b1cd80697afb3b14af268eb9aa157da8de120"} Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.002025 4849 generic.go:334] "Generic (PLEG): container finished" podID="481f5fb9-0040-4372-96b7-15e549dab23a" containerID="140f4f346af43508331dc6f7f986eb57d97ee9144909d9e6a822159124442cfb" exitCode=0 Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.002988 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a99-account-create-update-8k85v" event={"ID":"481f5fb9-0040-4372-96b7-15e549dab23a","Type":"ContainerDied","Data":"140f4f346af43508331dc6f7f986eb57d97ee9144909d9e6a822159124442cfb"} Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.051796 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57669291-1fb9-4564-aa80-25c9cdf20aa0" path="/var/lib/kubelet/pods/57669291-1fb9-4564-aa80-25c9cdf20aa0/volumes" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.271631 4849 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-db-create-9d498"] Mar 20 13:43:07 crc kubenswrapper[4849]: E0320 13:43:07.271935 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57669291-1fb9-4564-aa80-25c9cdf20aa0" containerName="init" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.271949 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="57669291-1fb9-4564-aa80-25c9cdf20aa0" containerName="init" Mar 20 13:43:07 crc kubenswrapper[4849]: E0320 13:43:07.271958 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57669291-1fb9-4564-aa80-25c9cdf20aa0" containerName="dnsmasq-dns" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.271964 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="57669291-1fb9-4564-aa80-25c9cdf20aa0" containerName="dnsmasq-dns" Mar 20 13:43:07 crc kubenswrapper[4849]: E0320 13:43:07.271974 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ad0333-bcd8-4886-8d77-6741e82b8f3b" containerName="mariadb-account-create-update" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.271981 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ad0333-bcd8-4886-8d77-6741e82b8f3b" containerName="mariadb-account-create-update" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.272177 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="57669291-1fb9-4564-aa80-25c9cdf20aa0" containerName="dnsmasq-dns" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.272189 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ad0333-bcd8-4886-8d77-6741e82b8f3b" containerName="mariadb-account-create-update" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.272696 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9d498" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.283888 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9d498"] Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.382894 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-fdbf-account-create-update-2xkp5"] Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.384082 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.386128 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.395227 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fdbf-account-create-update-2xkp5"] Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.423721 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69beefe1-45de-469f-a3af-e42a88b38309-operator-scripts\") pod \"glance-db-create-9d498\" (UID: \"69beefe1-45de-469f-a3af-e42a88b38309\") " pod="openstack/glance-db-create-9d498" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.423896 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngc4m\" (UniqueName: \"kubernetes.io/projected/69beefe1-45de-469f-a3af-e42a88b38309-kube-api-access-ngc4m\") pod \"glance-db-create-9d498\" (UID: \"69beefe1-45de-469f-a3af-e42a88b38309\") " pod="openstack/glance-db-create-9d498" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.427170 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.525444 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b9c6f1-f1c5-4310-9c95-649b730470a5-operator-scripts\") pod \"81b9c6f1-f1c5-4310-9c95-649b730470a5\" (UID: \"81b9c6f1-f1c5-4310-9c95-649b730470a5\") " Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.525521 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7xj6\" (UniqueName: \"kubernetes.io/projected/81b9c6f1-f1c5-4310-9c95-649b730470a5-kube-api-access-g7xj6\") pod \"81b9c6f1-f1c5-4310-9c95-649b730470a5\" (UID: \"81b9c6f1-f1c5-4310-9c95-649b730470a5\") " Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.525854 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74xxr\" (UniqueName: \"kubernetes.io/projected/b0a03006-5384-4542-8b30-dc8bea37c96a-kube-api-access-74xxr\") pod \"glance-fdbf-account-create-update-2xkp5\" (UID: \"b0a03006-5384-4542-8b30-dc8bea37c96a\") " pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.525928 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a03006-5384-4542-8b30-dc8bea37c96a-operator-scripts\") pod \"glance-fdbf-account-create-update-2xkp5\" (UID: \"b0a03006-5384-4542-8b30-dc8bea37c96a\") " pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.525959 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngc4m\" (UniqueName: \"kubernetes.io/projected/69beefe1-45de-469f-a3af-e42a88b38309-kube-api-access-ngc4m\") pod \"glance-db-create-9d498\" (UID: 
\"69beefe1-45de-469f-a3af-e42a88b38309\") " pod="openstack/glance-db-create-9d498" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.525999 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b9c6f1-f1c5-4310-9c95-649b730470a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81b9c6f1-f1c5-4310-9c95-649b730470a5" (UID: "81b9c6f1-f1c5-4310-9c95-649b730470a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.526123 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69beefe1-45de-469f-a3af-e42a88b38309-operator-scripts\") pod \"glance-db-create-9d498\" (UID: \"69beefe1-45de-469f-a3af-e42a88b38309\") " pod="openstack/glance-db-create-9d498" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.526244 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b9c6f1-f1c5-4310-9c95-649b730470a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.526986 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69beefe1-45de-469f-a3af-e42a88b38309-operator-scripts\") pod \"glance-db-create-9d498\" (UID: \"69beefe1-45de-469f-a3af-e42a88b38309\") " pod="openstack/glance-db-create-9d498" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.530940 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b9c6f1-f1c5-4310-9c95-649b730470a5-kube-api-access-g7xj6" (OuterVolumeSpecName: "kube-api-access-g7xj6") pod "81b9c6f1-f1c5-4310-9c95-649b730470a5" (UID: "81b9c6f1-f1c5-4310-9c95-649b730470a5"). InnerVolumeSpecName "kube-api-access-g7xj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.554213 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngc4m\" (UniqueName: \"kubernetes.io/projected/69beefe1-45de-469f-a3af-e42a88b38309-kube-api-access-ngc4m\") pod \"glance-db-create-9d498\" (UID: \"69beefe1-45de-469f-a3af-e42a88b38309\") " pod="openstack/glance-db-create-9d498" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.592438 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9d498" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.627173 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74xxr\" (UniqueName: \"kubernetes.io/projected/b0a03006-5384-4542-8b30-dc8bea37c96a-kube-api-access-74xxr\") pod \"glance-fdbf-account-create-update-2xkp5\" (UID: \"b0a03006-5384-4542-8b30-dc8bea37c96a\") " pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.627270 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a03006-5384-4542-8b30-dc8bea37c96a-operator-scripts\") pod \"glance-fdbf-account-create-update-2xkp5\" (UID: \"b0a03006-5384-4542-8b30-dc8bea37c96a\") " pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.627388 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7xj6\" (UniqueName: \"kubernetes.io/projected/81b9c6f1-f1c5-4310-9c95-649b730470a5-kube-api-access-g7xj6\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.628381 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a03006-5384-4542-8b30-dc8bea37c96a-operator-scripts\") pod 
\"glance-fdbf-account-create-update-2xkp5\" (UID: \"b0a03006-5384-4542-8b30-dc8bea37c96a\") " pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.643753 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74xxr\" (UniqueName: \"kubernetes.io/projected/b0a03006-5384-4542-8b30-dc8bea37c96a-kube-api-access-74xxr\") pod \"glance-fdbf-account-create-update-2xkp5\" (UID: \"b0a03006-5384-4542-8b30-dc8bea37c96a\") " pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:07 crc kubenswrapper[4849]: I0320 13:43:07.736476 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:07.999922 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9d498"] Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.011767 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9d498" event={"ID":"69beefe1-45de-469f-a3af-e42a88b38309","Type":"ContainerStarted","Data":"b29c6eca2e2633aa81ef155600b1e2c1bd39d4a67482dd5b3185cfe846cdb8fa"} Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.013842 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-pjvhk" event={"ID":"81b9c6f1-f1c5-4310-9c95-649b730470a5","Type":"ContainerDied","Data":"467c284c60ce03eb903272bc6f0cc4104ba691810509606207012853ff0dd37a"} Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.013897 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="467c284c60ce03eb903272bc6f0cc4104ba691810509606207012853ff0dd37a" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.014023 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-pjvhk" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.275887 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fdbf-account-create-update-2xkp5"] Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.422009 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.458388 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.463759 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.562152 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b28b324-7675-41b1-b1af-e37801c55af0-operator-scripts\") pod \"0b28b324-7675-41b1-b1af-e37801c55af0\" (UID: \"0b28b324-7675-41b1-b1af-e37801c55af0\") " Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.562225 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrlp2\" (UniqueName: \"kubernetes.io/projected/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-kube-api-access-vrlp2\") pod \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\" (UID: \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\") " Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.562271 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481f5fb9-0040-4372-96b7-15e549dab23a-operator-scripts\") pod \"481f5fb9-0040-4372-96b7-15e549dab23a\" (UID: \"481f5fb9-0040-4372-96b7-15e549dab23a\") " Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.562346 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gdq8\" (UniqueName: \"kubernetes.io/projected/0b28b324-7675-41b1-b1af-e37801c55af0-kube-api-access-2gdq8\") pod \"0b28b324-7675-41b1-b1af-e37801c55af0\" (UID: \"0b28b324-7675-41b1-b1af-e37801c55af0\") " Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.563171 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/481f5fb9-0040-4372-96b7-15e549dab23a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "481f5fb9-0040-4372-96b7-15e549dab23a" (UID: "481f5fb9-0040-4372-96b7-15e549dab23a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.563288 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b28b324-7675-41b1-b1af-e37801c55af0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b28b324-7675-41b1-b1af-e37801c55af0" (UID: "0b28b324-7675-41b1-b1af-e37801c55af0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.563365 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4t7m\" (UniqueName: \"kubernetes.io/projected/481f5fb9-0040-4372-96b7-15e549dab23a-kube-api-access-x4t7m\") pod \"481f5fb9-0040-4372-96b7-15e549dab23a\" (UID: \"481f5fb9-0040-4372-96b7-15e549dab23a\") " Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.564001 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-operator-scripts\") pod \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\" (UID: \"c0fd8a46-86a6-403d-b740-ddd048bdc4b0\") " Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.564481 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/481f5fb9-0040-4372-96b7-15e549dab23a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.565013 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0fd8a46-86a6-403d-b740-ddd048bdc4b0" (UID: "c0fd8a46-86a6-403d-b740-ddd048bdc4b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.567719 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481f5fb9-0040-4372-96b7-15e549dab23a-kube-api-access-x4t7m" (OuterVolumeSpecName: "kube-api-access-x4t7m") pod "481f5fb9-0040-4372-96b7-15e549dab23a" (UID: "481f5fb9-0040-4372-96b7-15e549dab23a"). InnerVolumeSpecName "kube-api-access-x4t7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.567811 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b28b324-7675-41b1-b1af-e37801c55af0-kube-api-access-2gdq8" (OuterVolumeSpecName: "kube-api-access-2gdq8") pod "0b28b324-7675-41b1-b1af-e37801c55af0" (UID: "0b28b324-7675-41b1-b1af-e37801c55af0"). InnerVolumeSpecName "kube-api-access-2gdq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.568024 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-kube-api-access-vrlp2" (OuterVolumeSpecName: "kube-api-access-vrlp2") pod "c0fd8a46-86a6-403d-b740-ddd048bdc4b0" (UID: "c0fd8a46-86a6-403d-b740-ddd048bdc4b0"). InnerVolumeSpecName "kube-api-access-vrlp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.665999 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrlp2\" (UniqueName: \"kubernetes.io/projected/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-kube-api-access-vrlp2\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.666039 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gdq8\" (UniqueName: \"kubernetes.io/projected/0b28b324-7675-41b1-b1af-e37801c55af0-kube-api-access-2gdq8\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.666049 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4t7m\" (UniqueName: \"kubernetes.io/projected/481f5fb9-0040-4372-96b7-15e549dab23a-kube-api-access-x4t7m\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.666058 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c0fd8a46-86a6-403d-b740-ddd048bdc4b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:08 crc kubenswrapper[4849]: I0320 13:43:08.666066 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b28b324-7675-41b1-b1af-e37801c55af0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.025328 4849 generic.go:334] "Generic (PLEG): container finished" podID="69beefe1-45de-469f-a3af-e42a88b38309" containerID="ff1e4d29f3ee14059efcbb76cce1d29efadb9d830a66c69dd1051995ea2dbf8d" exitCode=0 Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.025507 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9d498" event={"ID":"69beefe1-45de-469f-a3af-e42a88b38309","Type":"ContainerDied","Data":"ff1e4d29f3ee14059efcbb76cce1d29efadb9d830a66c69dd1051995ea2dbf8d"} Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.027992 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0a99-account-create-update-8k85v" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.027996 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0a99-account-create-update-8k85v" event={"ID":"481f5fb9-0040-4372-96b7-15e549dab23a","Type":"ContainerDied","Data":"35d587dbe4a097b6c68a359a3e51584e8f40bbe7f8fce34c71e18c6e478a8ba8"} Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.028031 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35d587dbe4a097b6c68a359a3e51584e8f40bbe7f8fce34c71e18c6e478a8ba8" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.029780 4849 generic.go:334] "Generic (PLEG): container finished" podID="b0a03006-5384-4542-8b30-dc8bea37c96a" containerID="90bd408c86ca7b27de8160bc1ceee85217671e011c05722d93b404ae39bda19f" exitCode=0 Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.029845 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fdbf-account-create-update-2xkp5" event={"ID":"b0a03006-5384-4542-8b30-dc8bea37c96a","Type":"ContainerDied","Data":"90bd408c86ca7b27de8160bc1ceee85217671e011c05722d93b404ae39bda19f"} Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.029863 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fdbf-account-create-update-2xkp5" event={"ID":"b0a03006-5384-4542-8b30-dc8bea37c96a","Type":"ContainerStarted","Data":"7ac04901815040e3d7763e3c28cf133c0e9b3d0b2e724aa26a5eb3817ad7cbb7"} Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.031600 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3d75-account-create-update-6bd9j" event={"ID":"0b28b324-7675-41b1-b1af-e37801c55af0","Type":"ContainerDied","Data":"42953762be9117886484445ecdae4fc8349c4055d7b6a7e8d221651e3c3a9e5d"} Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.031629 4849 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="42953762be9117886484445ecdae4fc8349c4055d7b6a7e8d221651e3c3a9e5d" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.031694 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3d75-account-create-update-6bd9j" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.035177 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qft7p" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.082934 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qft7p" event={"ID":"c0fd8a46-86a6-403d-b740-ddd048bdc4b0","Type":"ContainerDied","Data":"14fbcaac1f8c7bac65204cdec8e4baf472071798524509b652e79022626c0a71"} Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.082971 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14fbcaac1f8c7bac65204cdec8e4baf472071798524509b652e79022626c0a71" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.135875 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tdfz5"] Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.141964 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tdfz5"] Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.220245 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-n79wf"] Mar 20 13:43:09 crc kubenswrapper[4849]: E0320 13:43:09.220650 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b9c6f1-f1c5-4310-9c95-649b730470a5" containerName="mariadb-database-create" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.220672 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b9c6f1-f1c5-4310-9c95-649b730470a5" containerName="mariadb-database-create" Mar 20 13:43:09 crc kubenswrapper[4849]: E0320 13:43:09.220687 4849 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c0fd8a46-86a6-403d-b740-ddd048bdc4b0" containerName="mariadb-database-create" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.220695 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0fd8a46-86a6-403d-b740-ddd048bdc4b0" containerName="mariadb-database-create" Mar 20 13:43:09 crc kubenswrapper[4849]: E0320 13:43:09.220707 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b28b324-7675-41b1-b1af-e37801c55af0" containerName="mariadb-account-create-update" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.220716 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b28b324-7675-41b1-b1af-e37801c55af0" containerName="mariadb-account-create-update" Mar 20 13:43:09 crc kubenswrapper[4849]: E0320 13:43:09.220727 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481f5fb9-0040-4372-96b7-15e549dab23a" containerName="mariadb-account-create-update" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.220735 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="481f5fb9-0040-4372-96b7-15e549dab23a" containerName="mariadb-account-create-update" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.220995 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="481f5fb9-0040-4372-96b7-15e549dab23a" containerName="mariadb-account-create-update" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.221024 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b28b324-7675-41b1-b1af-e37801c55af0" containerName="mariadb-account-create-update" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.221042 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b9c6f1-f1c5-4310-9c95-649b730470a5" containerName="mariadb-database-create" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.221056 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0fd8a46-86a6-403d-b740-ddd048bdc4b0" 
containerName="mariadb-database-create" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.221703 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.224892 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.234098 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n79wf"] Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.378571 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6thh\" (UniqueName: \"kubernetes.io/projected/1912edcd-7266-4942-a039-5179f6b98661-kube-api-access-f6thh\") pod \"root-account-create-update-n79wf\" (UID: \"1912edcd-7266-4942-a039-5179f6b98661\") " pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.378656 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1912edcd-7266-4942-a039-5179f6b98661-operator-scripts\") pod \"root-account-create-update-n79wf\" (UID: \"1912edcd-7266-4942-a039-5179f6b98661\") " pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.480398 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6thh\" (UniqueName: \"kubernetes.io/projected/1912edcd-7266-4942-a039-5179f6b98661-kube-api-access-f6thh\") pod \"root-account-create-update-n79wf\" (UID: \"1912edcd-7266-4942-a039-5179f6b98661\") " pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.480497 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/1912edcd-7266-4942-a039-5179f6b98661-operator-scripts\") pod \"root-account-create-update-n79wf\" (UID: \"1912edcd-7266-4942-a039-5179f6b98661\") " pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.482352 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1912edcd-7266-4942-a039-5179f6b98661-operator-scripts\") pod \"root-account-create-update-n79wf\" (UID: \"1912edcd-7266-4942-a039-5179f6b98661\") " pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.514584 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6thh\" (UniqueName: \"kubernetes.io/projected/1912edcd-7266-4942-a039-5179f6b98661-kube-api-access-f6thh\") pod \"root-account-create-update-n79wf\" (UID: \"1912edcd-7266-4942-a039-5179f6b98661\") " pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.535119 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:09 crc kubenswrapper[4849]: I0320 13:43:09.536419 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.005630 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n79wf"] Mar 20 13:43:10 crc kubenswrapper[4849]: W0320 13:43:10.023316 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1912edcd_7266_4942_a039_5179f6b98661.slice/crio-4e5b5793a908119ea9baf31cfbb5097f0b9bd27cb4287fb613c363ac0cbe9ec1 WatchSource:0}: Error finding container 4e5b5793a908119ea9baf31cfbb5097f0b9bd27cb4287fb613c363ac0cbe9ec1: Status 404 returned error can't find the container with id 4e5b5793a908119ea9baf31cfbb5097f0b9bd27cb4287fb613c363ac0cbe9ec1 Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.051644 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n79wf" event={"ID":"1912edcd-7266-4942-a039-5179f6b98661","Type":"ContainerStarted","Data":"4e5b5793a908119ea9baf31cfbb5097f0b9bd27cb4287fb613c363ac0cbe9ec1"} Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.459625 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9d498" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.464850 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.601592 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69beefe1-45de-469f-a3af-e42a88b38309-operator-scripts\") pod \"69beefe1-45de-469f-a3af-e42a88b38309\" (UID: \"69beefe1-45de-469f-a3af-e42a88b38309\") " Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.601753 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74xxr\" (UniqueName: \"kubernetes.io/projected/b0a03006-5384-4542-8b30-dc8bea37c96a-kube-api-access-74xxr\") pod \"b0a03006-5384-4542-8b30-dc8bea37c96a\" (UID: \"b0a03006-5384-4542-8b30-dc8bea37c96a\") " Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.601888 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngc4m\" (UniqueName: \"kubernetes.io/projected/69beefe1-45de-469f-a3af-e42a88b38309-kube-api-access-ngc4m\") pod \"69beefe1-45de-469f-a3af-e42a88b38309\" (UID: \"69beefe1-45de-469f-a3af-e42a88b38309\") " Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.601936 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a03006-5384-4542-8b30-dc8bea37c96a-operator-scripts\") pod \"b0a03006-5384-4542-8b30-dc8bea37c96a\" (UID: \"b0a03006-5384-4542-8b30-dc8bea37c96a\") " Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.602340 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69beefe1-45de-469f-a3af-e42a88b38309-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69beefe1-45de-469f-a3af-e42a88b38309" (UID: "69beefe1-45de-469f-a3af-e42a88b38309"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.602628 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a03006-5384-4542-8b30-dc8bea37c96a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0a03006-5384-4542-8b30-dc8bea37c96a" (UID: "b0a03006-5384-4542-8b30-dc8bea37c96a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.607974 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69beefe1-45de-469f-a3af-e42a88b38309-kube-api-access-ngc4m" (OuterVolumeSpecName: "kube-api-access-ngc4m") pod "69beefe1-45de-469f-a3af-e42a88b38309" (UID: "69beefe1-45de-469f-a3af-e42a88b38309"). InnerVolumeSpecName "kube-api-access-ngc4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.608120 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a03006-5384-4542-8b30-dc8bea37c96a-kube-api-access-74xxr" (OuterVolumeSpecName: "kube-api-access-74xxr") pod "b0a03006-5384-4542-8b30-dc8bea37c96a" (UID: "b0a03006-5384-4542-8b30-dc8bea37c96a"). InnerVolumeSpecName "kube-api-access-74xxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.703614 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngc4m\" (UniqueName: \"kubernetes.io/projected/69beefe1-45de-469f-a3af-e42a88b38309-kube-api-access-ngc4m\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.703916 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a03006-5384-4542-8b30-dc8bea37c96a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.703925 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69beefe1-45de-469f-a3af-e42a88b38309-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:10 crc kubenswrapper[4849]: I0320 13:43:10.703933 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74xxr\" (UniqueName: \"kubernetes.io/projected/b0a03006-5384-4542-8b30-dc8bea37c96a-kube-api-access-74xxr\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.056476 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ad0333-bcd8-4886-8d77-6741e82b8f3b" path="/var/lib/kubelet/pods/a8ad0333-bcd8-4886-8d77-6741e82b8f3b/volumes" Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.061441 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fdbf-account-create-update-2xkp5" event={"ID":"b0a03006-5384-4542-8b30-dc8bea37c96a","Type":"ContainerDied","Data":"7ac04901815040e3d7763e3c28cf133c0e9b3d0b2e724aa26a5eb3817ad7cbb7"} Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.061478 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fdbf-account-create-update-2xkp5" Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.061501 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ac04901815040e3d7763e3c28cf133c0e9b3d0b2e724aa26a5eb3817ad7cbb7" Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.072480 4849 generic.go:334] "Generic (PLEG): container finished" podID="1912edcd-7266-4942-a039-5179f6b98661" containerID="366ba86dbcac033ff4daf8784d2cc247035000df23e6aff3f237b22d6ac1b19d" exitCode=0 Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.074030 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n79wf" event={"ID":"1912edcd-7266-4942-a039-5179f6b98661","Type":"ContainerDied","Data":"366ba86dbcac033ff4daf8784d2cc247035000df23e6aff3f237b22d6ac1b19d"} Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.074452 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9d498" Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.076923 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9d498" event={"ID":"69beefe1-45de-469f-a3af-e42a88b38309","Type":"ContainerDied","Data":"b29c6eca2e2633aa81ef155600b1e2c1bd39d4a67482dd5b3185cfe846cdb8fa"} Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.076999 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b29c6eca2e2633aa81ef155600b1e2c1bd39d4a67482dd5b3185cfe846cdb8fa" Mar 20 13:43:11 crc kubenswrapper[4849]: I0320 13:43:11.342077 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:43:11 crc kubenswrapper[4849]: E0320 13:43:11.342399 4849 
projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:43:11 crc kubenswrapper[4849]: E0320 13:43:11.342471 4849 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:43:11 crc kubenswrapper[4849]: E0320 13:43:11.342580 4849 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift podName:189da4ab-90d9-4761-b94e-77f30a025385 nodeName:}" failed. No retries permitted until 2026-03-20 13:43:27.342548356 +0000 UTC m=+1157.020271781 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift") pod "swift-storage-0" (UID: "189da4ab-90d9-4761-b94e-77f30a025385") : configmap "swift-ring-files" not found Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.083458 4849 generic.go:334] "Generic (PLEG): container finished" podID="07ad4563-bfe9-462b-8191-f21c950281df" containerID="de6ab5f36fe63dfdd36651341060812b8837886687aea327c5a6b05ff7de0c6c" exitCode=0 Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.083562 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pmrvl" event={"ID":"07ad4563-bfe9-462b-8191-f21c950281df","Type":"ContainerDied","Data":"de6ab5f36fe63dfdd36651341060812b8837886687aea327c5a6b05ff7de0c6c"} Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.532141 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.567002 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1912edcd-7266-4942-a039-5179f6b98661-operator-scripts\") pod \"1912edcd-7266-4942-a039-5179f6b98661\" (UID: \"1912edcd-7266-4942-a039-5179f6b98661\") " Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.567083 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6thh\" (UniqueName: \"kubernetes.io/projected/1912edcd-7266-4942-a039-5179f6b98661-kube-api-access-f6thh\") pod \"1912edcd-7266-4942-a039-5179f6b98661\" (UID: \"1912edcd-7266-4942-a039-5179f6b98661\") " Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.567689 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1912edcd-7266-4942-a039-5179f6b98661-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1912edcd-7266-4942-a039-5179f6b98661" (UID: "1912edcd-7266-4942-a039-5179f6b98661"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.573712 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1912edcd-7266-4942-a039-5179f6b98661-kube-api-access-f6thh" (OuterVolumeSpecName: "kube-api-access-f6thh") pod "1912edcd-7266-4942-a039-5179f6b98661" (UID: "1912edcd-7266-4942-a039-5179f6b98661"). InnerVolumeSpecName "kube-api-access-f6thh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.592643 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9znfj" podUID="f589037a-06aa-452d-82ef-0dbf2177b7fc" containerName="ovn-controller" probeResult="failure" output=< Mar 20 13:43:12 crc kubenswrapper[4849]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 13:43:12 crc kubenswrapper[4849]: > Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.622515 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.624661 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-66s8p"] Mar 20 13:43:12 crc kubenswrapper[4849]: E0320 13:43:12.625078 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69beefe1-45de-469f-a3af-e42a88b38309" containerName="mariadb-database-create" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.625102 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="69beefe1-45de-469f-a3af-e42a88b38309" containerName="mariadb-database-create" Mar 20 13:43:12 crc kubenswrapper[4849]: E0320 13:43:12.625123 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1912edcd-7266-4942-a039-5179f6b98661" containerName="mariadb-account-create-update" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.625132 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="1912edcd-7266-4942-a039-5179f6b98661" containerName="mariadb-account-create-update" Mar 20 13:43:12 crc kubenswrapper[4849]: E0320 13:43:12.625165 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a03006-5384-4542-8b30-dc8bea37c96a" containerName="mariadb-account-create-update" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.625174 4849 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b0a03006-5384-4542-8b30-dc8bea37c96a" containerName="mariadb-account-create-update" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.630736 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="69beefe1-45de-469f-a3af-e42a88b38309" containerName="mariadb-database-create" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.630794 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="1912edcd-7266-4942-a039-5179f6b98661" containerName="mariadb-account-create-update" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.630854 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a03006-5384-4542-8b30-dc8bea37c96a" containerName="mariadb-account-create-update" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.631364 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.633001 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-z4p48" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.633130 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.635549 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-66s8p"] Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.642915 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-226bs" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.671294 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6thh\" (UniqueName: \"kubernetes.io/projected/1912edcd-7266-4942-a039-5179f6b98661-kube-api-access-f6thh\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.671330 4849 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1912edcd-7266-4942-a039-5179f6b98661-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.772798 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc9gq\" (UniqueName: \"kubernetes.io/projected/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-kube-api-access-mc9gq\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.773209 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-config-data\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.773229 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-db-sync-config-data\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.773254 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-combined-ca-bundle\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.859841 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9znfj-config-thzt9"] Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.860972 4849 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.862971 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.874495 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-combined-ca-bundle\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.874675 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc9gq\" (UniqueName: \"kubernetes.io/projected/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-kube-api-access-mc9gq\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.874701 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-config-data\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.874721 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-db-sync-config-data\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.884204 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9znfj-config-thzt9"] Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 
13:43:12.886491 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-config-data\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.886937 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-db-sync-config-data\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.887427 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-combined-ca-bundle\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.908922 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc9gq\" (UniqueName: \"kubernetes.io/projected/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-kube-api-access-mc9gq\") pod \"glance-db-sync-66s8p\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.947357 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.975944 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run-ovn\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.975983 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-additional-scripts\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.976019 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-log-ovn\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.976093 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hr4g\" (UniqueName: \"kubernetes.io/projected/c7ee7080-712f-45db-a699-49685b6d95aa-kube-api-access-7hr4g\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.976405 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:12 crc kubenswrapper[4849]: I0320 13:43:12.976542 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-scripts\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.077947 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-additional-scripts\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.078307 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run-ovn\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.078332 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-log-ovn\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.078391 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hr4g\" (UniqueName: 
\"kubernetes.io/projected/c7ee7080-712f-45db-a699-49685b6d95aa-kube-api-access-7hr4g\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.078412 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.078452 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-scripts\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.079092 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-additional-scripts\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.079117 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.079105 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-log-ovn\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.079121 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run-ovn\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.089417 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-scripts\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.101015 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hr4g\" (UniqueName: \"kubernetes.io/projected/c7ee7080-712f-45db-a699-49685b6d95aa-kube-api-access-7hr4g\") pod \"ovn-controller-9znfj-config-thzt9\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.102849 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n79wf" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.103353 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n79wf" event={"ID":"1912edcd-7266-4942-a039-5179f6b98661","Type":"ContainerDied","Data":"4e5b5793a908119ea9baf31cfbb5097f0b9bd27cb4287fb613c363ac0cbe9ec1"} Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.103421 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e5b5793a908119ea9baf31cfbb5097f0b9bd27cb4287fb613c363ac0cbe9ec1" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.281884 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.323145 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-66s8p"] Mar 20 13:43:13 crc kubenswrapper[4849]: W0320 13:43:13.354047 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4baaa4a5_7434_40f0_bfee_185b7fc4fafb.slice/crio-cbbd5e87d17ba3419fa8dafbfb00c4ce632c9fead87ee372300140fed80462b1 WatchSource:0}: Error finding container cbbd5e87d17ba3419fa8dafbfb00c4ce632c9fead87ee372300140fed80462b1: Status 404 returned error can't find the container with id cbbd5e87d17ba3419fa8dafbfb00c4ce632c9fead87ee372300140fed80462b1 Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.557501 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.639009 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9znfj-config-thzt9"] Mar 20 13:43:13 crc kubenswrapper[4849]: W0320 13:43:13.642423 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7ee7080_712f_45db_a699_49685b6d95aa.slice/crio-c225c192de9f1afc5fd503043015aae5b0111f275f19948e9fe0868aa1c3bf8b WatchSource:0}: Error finding container c225c192de9f1afc5fd503043015aae5b0111f275f19948e9fe0868aa1c3bf8b: Status 404 returned error can't find the container with id c225c192de9f1afc5fd503043015aae5b0111f275f19948e9fe0868aa1c3bf8b Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.688905 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-dispersionconf\") pod \"07ad4563-bfe9-462b-8191-f21c950281df\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.688982 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-ring-data-devices\") pod \"07ad4563-bfe9-462b-8191-f21c950281df\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.689018 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-combined-ca-bundle\") pod \"07ad4563-bfe9-462b-8191-f21c950281df\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.689054 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-swiftconf\") pod \"07ad4563-bfe9-462b-8191-f21c950281df\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.689141 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfhkv\" (UniqueName: \"kubernetes.io/projected/07ad4563-bfe9-462b-8191-f21c950281df-kube-api-access-dfhkv\") pod \"07ad4563-bfe9-462b-8191-f21c950281df\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.689188 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ad4563-bfe9-462b-8191-f21c950281df-etc-swift\") pod \"07ad4563-bfe9-462b-8191-f21c950281df\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.689219 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-scripts\") pod \"07ad4563-bfe9-462b-8191-f21c950281df\" (UID: \"07ad4563-bfe9-462b-8191-f21c950281df\") " Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.690903 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "07ad4563-bfe9-462b-8191-f21c950281df" (UID: "07ad4563-bfe9-462b-8191-f21c950281df"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.692322 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07ad4563-bfe9-462b-8191-f21c950281df-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "07ad4563-bfe9-462b-8191-f21c950281df" (UID: "07ad4563-bfe9-462b-8191-f21c950281df"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.694065 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ad4563-bfe9-462b-8191-f21c950281df-kube-api-access-dfhkv" (OuterVolumeSpecName: "kube-api-access-dfhkv") pod "07ad4563-bfe9-462b-8191-f21c950281df" (UID: "07ad4563-bfe9-462b-8191-f21c950281df"). InnerVolumeSpecName "kube-api-access-dfhkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.698975 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "07ad4563-bfe9-462b-8191-f21c950281df" (UID: "07ad4563-bfe9-462b-8191-f21c950281df"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.712134 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "07ad4563-bfe9-462b-8191-f21c950281df" (UID: "07ad4563-bfe9-462b-8191-f21c950281df"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.712164 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-scripts" (OuterVolumeSpecName: "scripts") pod "07ad4563-bfe9-462b-8191-f21c950281df" (UID: "07ad4563-bfe9-462b-8191-f21c950281df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.717316 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07ad4563-bfe9-462b-8191-f21c950281df" (UID: "07ad4563-bfe9-462b-8191-f21c950281df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.791632 4849 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.791666 4849 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.791676 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.791688 4849 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/07ad4563-bfe9-462b-8191-f21c950281df-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 
13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.791698 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfhkv\" (UniqueName: \"kubernetes.io/projected/07ad4563-bfe9-462b-8191-f21c950281df-kube-api-access-dfhkv\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.791709 4849 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/07ad4563-bfe9-462b-8191-f21c950281df-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:13 crc kubenswrapper[4849]: I0320 13:43:13.791717 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07ad4563-bfe9-462b-8191-f21c950281df-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:14 crc kubenswrapper[4849]: I0320 13:43:14.112656 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9znfj-config-thzt9" event={"ID":"c7ee7080-712f-45db-a699-49685b6d95aa","Type":"ContainerStarted","Data":"628e9a3aec8dbfa2c7fee28dd556fdcd30e193a34558b1ab9cab5c176af2dfb0"} Mar 20 13:43:14 crc kubenswrapper[4849]: I0320 13:43:14.112997 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9znfj-config-thzt9" event={"ID":"c7ee7080-712f-45db-a699-49685b6d95aa","Type":"ContainerStarted","Data":"c225c192de9f1afc5fd503043015aae5b0111f275f19948e9fe0868aa1c3bf8b"} Mar 20 13:43:14 crc kubenswrapper[4849]: I0320 13:43:14.117030 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-pmrvl" event={"ID":"07ad4563-bfe9-462b-8191-f21c950281df","Type":"ContainerDied","Data":"a92503785b2bfc91fcf1f7cd7efa309483058c893160c2d0544d539dc2b8ec3a"} Mar 20 13:43:14 crc kubenswrapper[4849]: I0320 13:43:14.117068 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a92503785b2bfc91fcf1f7cd7efa309483058c893160c2d0544d539dc2b8ec3a" Mar 20 13:43:14 crc kubenswrapper[4849]: 
I0320 13:43:14.117125 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-pmrvl" Mar 20 13:43:14 crc kubenswrapper[4849]: I0320 13:43:14.121999 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-66s8p" event={"ID":"4baaa4a5-7434-40f0-bfee-185b7fc4fafb","Type":"ContainerStarted","Data":"cbbd5e87d17ba3419fa8dafbfb00c4ce632c9fead87ee372300140fed80462b1"} Mar 20 13:43:14 crc kubenswrapper[4849]: I0320 13:43:14.147987 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9znfj-config-thzt9" podStartSLOduration=2.147969939 podStartE2EDuration="2.147969939s" podCreationTimestamp="2026-03-20 13:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:14.139328333 +0000 UTC m=+1143.817051748" watchObservedRunningTime="2026-03-20 13:43:14.147969939 +0000 UTC m=+1143.825693334" Mar 20 13:43:15 crc kubenswrapper[4849]: I0320 13:43:15.153150 4849 generic.go:334] "Generic (PLEG): container finished" podID="c7ee7080-712f-45db-a699-49685b6d95aa" containerID="628e9a3aec8dbfa2c7fee28dd556fdcd30e193a34558b1ab9cab5c176af2dfb0" exitCode=0 Mar 20 13:43:15 crc kubenswrapper[4849]: I0320 13:43:15.153381 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9znfj-config-thzt9" event={"ID":"c7ee7080-712f-45db-a699-49685b6d95aa","Type":"ContainerDied","Data":"628e9a3aec8dbfa2c7fee28dd556fdcd30e193a34558b1ab9cab5c176af2dfb0"} Mar 20 13:43:15 crc kubenswrapper[4849]: I0320 13:43:15.577419 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-n79wf"] Mar 20 13:43:15 crc kubenswrapper[4849]: I0320 13:43:15.583582 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-n79wf"] Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 
13:43:16.520348 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.641484 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run\") pod \"c7ee7080-712f-45db-a699-49685b6d95aa\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.641558 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hr4g\" (UniqueName: \"kubernetes.io/projected/c7ee7080-712f-45db-a699-49685b6d95aa-kube-api-access-7hr4g\") pod \"c7ee7080-712f-45db-a699-49685b6d95aa\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.641627 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-scripts\") pod \"c7ee7080-712f-45db-a699-49685b6d95aa\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.641652 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-log-ovn\") pod \"c7ee7080-712f-45db-a699-49685b6d95aa\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.641687 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run-ovn\") pod \"c7ee7080-712f-45db-a699-49685b6d95aa\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.641707 4849 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-additional-scripts\") pod \"c7ee7080-712f-45db-a699-49685b6d95aa\" (UID: \"c7ee7080-712f-45db-a699-49685b6d95aa\") " Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.641941 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run" (OuterVolumeSpecName: "var-run") pod "c7ee7080-712f-45db-a699-49685b6d95aa" (UID: "c7ee7080-712f-45db-a699-49685b6d95aa"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.641983 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c7ee7080-712f-45db-a699-49685b6d95aa" (UID: "c7ee7080-712f-45db-a699-49685b6d95aa"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.642048 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c7ee7080-712f-45db-a699-49685b6d95aa" (UID: "c7ee7080-712f-45db-a699-49685b6d95aa"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.642140 4849 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.642155 4849 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.642635 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c7ee7080-712f-45db-a699-49685b6d95aa" (UID: "c7ee7080-712f-45db-a699-49685b6d95aa"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.642885 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-scripts" (OuterVolumeSpecName: "scripts") pod "c7ee7080-712f-45db-a699-49685b6d95aa" (UID: "c7ee7080-712f-45db-a699-49685b6d95aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.651114 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ee7080-712f-45db-a699-49685b6d95aa-kube-api-access-7hr4g" (OuterVolumeSpecName: "kube-api-access-7hr4g") pod "c7ee7080-712f-45db-a699-49685b6d95aa" (UID: "c7ee7080-712f-45db-a699-49685b6d95aa"). InnerVolumeSpecName "kube-api-access-7hr4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.743012 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.743047 4849 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c7ee7080-712f-45db-a699-49685b6d95aa-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.743060 4849 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ee7080-712f-45db-a699-49685b6d95aa-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:16 crc kubenswrapper[4849]: I0320 13:43:16.743074 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hr4g\" (UniqueName: \"kubernetes.io/projected/c7ee7080-712f-45db-a699-49685b6d95aa-kube-api-access-7hr4g\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:17 crc kubenswrapper[4849]: I0320 13:43:17.054404 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1912edcd-7266-4942-a039-5179f6b98661" path="/var/lib/kubelet/pods/1912edcd-7266-4942-a039-5179f6b98661/volumes" Mar 20 13:43:17 crc kubenswrapper[4849]: I0320 13:43:17.176997 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9znfj-config-thzt9" event={"ID":"c7ee7080-712f-45db-a699-49685b6d95aa","Type":"ContainerDied","Data":"c225c192de9f1afc5fd503043015aae5b0111f275f19948e9fe0868aa1c3bf8b"} Mar 20 13:43:17 crc kubenswrapper[4849]: I0320 13:43:17.177038 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c225c192de9f1afc5fd503043015aae5b0111f275f19948e9fe0868aa1c3bf8b" Mar 20 13:43:17 crc kubenswrapper[4849]: I0320 13:43:17.177071 4849 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9znfj-config-thzt9" Mar 20 13:43:17 crc kubenswrapper[4849]: I0320 13:43:17.241581 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9znfj-config-thzt9"] Mar 20 13:43:17 crc kubenswrapper[4849]: I0320 13:43:17.248469 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9znfj-config-thzt9"] Mar 20 13:43:17 crc kubenswrapper[4849]: I0320 13:43:17.588840 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9znfj" Mar 20 13:43:19 crc kubenswrapper[4849]: I0320 13:43:19.048496 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ee7080-712f-45db-a699-49685b6d95aa" path="/var/lib/kubelet/pods/c7ee7080-712f-45db-a699-49685b6d95aa/volumes" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.595104 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k72fr"] Mar 20 13:43:20 crc kubenswrapper[4849]: E0320 13:43:20.595610 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ee7080-712f-45db-a699-49685b6d95aa" containerName="ovn-config" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.597376 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ee7080-712f-45db-a699-49685b6d95aa" containerName="ovn-config" Mar 20 13:43:20 crc kubenswrapper[4849]: E0320 13:43:20.597428 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ad4563-bfe9-462b-8191-f21c950281df" containerName="swift-ring-rebalance" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.597435 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ad4563-bfe9-462b-8191-f21c950281df" containerName="swift-ring-rebalance" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.597632 4849 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="07ad4563-bfe9-462b-8191-f21c950281df" containerName="swift-ring-rebalance" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.597656 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ee7080-712f-45db-a699-49685b6d95aa" containerName="ovn-config" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.598430 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.601477 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.606709 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k72fr"] Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.721032 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef32d779-a195-46ae-9112-3ecdbfe73a1e-operator-scripts\") pod \"root-account-create-update-k72fr\" (UID: \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\") " pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.721123 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc5pv\" (UniqueName: \"kubernetes.io/projected/ef32d779-a195-46ae-9112-3ecdbfe73a1e-kube-api-access-jc5pv\") pod \"root-account-create-update-k72fr\" (UID: \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\") " pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.822771 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef32d779-a195-46ae-9112-3ecdbfe73a1e-operator-scripts\") pod \"root-account-create-update-k72fr\" (UID: 
\"ef32d779-a195-46ae-9112-3ecdbfe73a1e\") " pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.823309 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc5pv\" (UniqueName: \"kubernetes.io/projected/ef32d779-a195-46ae-9112-3ecdbfe73a1e-kube-api-access-jc5pv\") pod \"root-account-create-update-k72fr\" (UID: \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\") " pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.823890 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef32d779-a195-46ae-9112-3ecdbfe73a1e-operator-scripts\") pod \"root-account-create-update-k72fr\" (UID: \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\") " pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.842457 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc5pv\" (UniqueName: \"kubernetes.io/projected/ef32d779-a195-46ae-9112-3ecdbfe73a1e-kube-api-access-jc5pv\") pod \"root-account-create-update-k72fr\" (UID: \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\") " pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:20 crc kubenswrapper[4849]: I0320 13:43:20.926506 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:25 crc kubenswrapper[4849]: I0320 13:43:25.325106 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k72fr"] Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.264819 4849 generic.go:334] "Generic (PLEG): container finished" podID="464306bd-0d8b-40ca-aa64-1ec5a00a527b" containerID="b1dd5e5fd8b797ef0e76485804a0026a2df6033f7edc7ea0a4bb051ae008a98a" exitCode=0 Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.264878 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"464306bd-0d8b-40ca-aa64-1ec5a00a527b","Type":"ContainerDied","Data":"b1dd5e5fd8b797ef0e76485804a0026a2df6033f7edc7ea0a4bb051ae008a98a"} Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.267021 4849 generic.go:334] "Generic (PLEG): container finished" podID="3c3c4952-4c22-4389-834c-969b89fb9e20" containerID="5156b08be662746166584ffb769bec7d11452c92898f57b2a574d5ca7c44f253" exitCode=0 Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.267084 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c3c4952-4c22-4389-834c-969b89fb9e20","Type":"ContainerDied","Data":"5156b08be662746166584ffb769bec7d11452c92898f57b2a574d5ca7c44f253"} Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.268489 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-66s8p" event={"ID":"4baaa4a5-7434-40f0-bfee-185b7fc4fafb","Type":"ContainerStarted","Data":"5cbd99f599b63b50294589943663a22165a805730ac589805dd1bcd01b8a9ce9"} Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.270395 4849 generic.go:334] "Generic (PLEG): container finished" podID="ef32d779-a195-46ae-9112-3ecdbfe73a1e" containerID="1a88a38842acb2f5623a2178083ec136d1bc6ee32eefbf4ffeef8c6c3f024a07" exitCode=0 Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.270450 4849 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k72fr" event={"ID":"ef32d779-a195-46ae-9112-3ecdbfe73a1e","Type":"ContainerDied","Data":"1a88a38842acb2f5623a2178083ec136d1bc6ee32eefbf4ffeef8c6c3f024a07"} Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.270471 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k72fr" event={"ID":"ef32d779-a195-46ae-9112-3ecdbfe73a1e","Type":"ContainerStarted","Data":"bc137be67c9bd8fd7fe6440d4c8fcfc8de1dbd24914f14843bd5d60aa51b379d"} Mar 20 13:43:26 crc kubenswrapper[4849]: I0320 13:43:26.365708 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-66s8p" podStartSLOduration=2.723190769 podStartE2EDuration="14.365680634s" podCreationTimestamp="2026-03-20 13:43:12 +0000 UTC" firstStartedPulling="2026-03-20 13:43:13.357048216 +0000 UTC m=+1143.034771611" lastFinishedPulling="2026-03-20 13:43:24.999538071 +0000 UTC m=+1154.677261476" observedRunningTime="2026-03-20 13:43:26.32931581 +0000 UTC m=+1156.007039225" watchObservedRunningTime="2026-03-20 13:43:26.365680634 +0000 UTC m=+1156.043404029" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.280274 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3c3c4952-4c22-4389-834c-969b89fb9e20","Type":"ContainerStarted","Data":"97169cc302ed38b44595441cc5e2977b1fa25eb6361ab6da315abf549a91a2a1"} Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.281809 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.282905 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"464306bd-0d8b-40ca-aa64-1ec5a00a527b","Type":"ContainerStarted","Data":"ca26ec57ef6fbea64103cd82b8cb007d98a75b9acb6c10ad363bd8b9cc1b64f4"} Mar 20 13:43:27 crc 
kubenswrapper[4849]: I0320 13:43:27.283197 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.313349 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=47.90794506 podStartE2EDuration="1m20.31333118s" podCreationTimestamp="2026-03-20 13:42:07 +0000 UTC" firstStartedPulling="2026-03-20 13:42:20.543939346 +0000 UTC m=+1090.221662741" lastFinishedPulling="2026-03-20 13:42:52.949325456 +0000 UTC m=+1122.627048861" observedRunningTime="2026-03-20 13:43:27.305124656 +0000 UTC m=+1156.982848061" watchObservedRunningTime="2026-03-20 13:43:27.31333118 +0000 UTC m=+1156.991054575" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.331809 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=47.717364423 podStartE2EDuration="1m20.331785045s" podCreationTimestamp="2026-03-20 13:42:07 +0000 UTC" firstStartedPulling="2026-03-20 13:42:20.406016777 +0000 UTC m=+1090.083740182" lastFinishedPulling="2026-03-20 13:42:53.020437419 +0000 UTC m=+1122.698160804" observedRunningTime="2026-03-20 13:43:27.324019203 +0000 UTC m=+1157.001742608" watchObservedRunningTime="2026-03-20 13:43:27.331785045 +0000 UTC m=+1157.009508440" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.428141 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod \"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.434368 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/189da4ab-90d9-4761-b94e-77f30a025385-etc-swift\") pod 
\"swift-storage-0\" (UID: \"189da4ab-90d9-4761-b94e-77f30a025385\") " pod="openstack/swift-storage-0" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.614588 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.664331 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.732437 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc5pv\" (UniqueName: \"kubernetes.io/projected/ef32d779-a195-46ae-9112-3ecdbfe73a1e-kube-api-access-jc5pv\") pod \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\" (UID: \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\") " Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.732557 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef32d779-a195-46ae-9112-3ecdbfe73a1e-operator-scripts\") pod \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\" (UID: \"ef32d779-a195-46ae-9112-3ecdbfe73a1e\") " Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.733560 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef32d779-a195-46ae-9112-3ecdbfe73a1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef32d779-a195-46ae-9112-3ecdbfe73a1e" (UID: "ef32d779-a195-46ae-9112-3ecdbfe73a1e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.739955 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef32d779-a195-46ae-9112-3ecdbfe73a1e-kube-api-access-jc5pv" (OuterVolumeSpecName: "kube-api-access-jc5pv") pod "ef32d779-a195-46ae-9112-3ecdbfe73a1e" (UID: "ef32d779-a195-46ae-9112-3ecdbfe73a1e"). InnerVolumeSpecName "kube-api-access-jc5pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.834363 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef32d779-a195-46ae-9112-3ecdbfe73a1e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:27 crc kubenswrapper[4849]: I0320 13:43:27.834700 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc5pv\" (UniqueName: \"kubernetes.io/projected/ef32d779-a195-46ae-9112-3ecdbfe73a1e-kube-api-access-jc5pv\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:28 crc kubenswrapper[4849]: I0320 13:43:28.282799 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:43:28 crc kubenswrapper[4849]: W0320 13:43:28.291700 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod189da4ab_90d9_4761_b94e_77f30a025385.slice/crio-a0ca95b79cf9a961fd83bef1c27129a8025bdc5de561b30d5ca08991344250dc WatchSource:0}: Error finding container a0ca95b79cf9a961fd83bef1c27129a8025bdc5de561b30d5ca08991344250dc: Status 404 returned error can't find the container with id a0ca95b79cf9a961fd83bef1c27129a8025bdc5de561b30d5ca08991344250dc Mar 20 13:43:28 crc kubenswrapper[4849]: I0320 13:43:28.298635 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k72fr" Mar 20 13:43:28 crc kubenswrapper[4849]: I0320 13:43:28.298897 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k72fr" event={"ID":"ef32d779-a195-46ae-9112-3ecdbfe73a1e","Type":"ContainerDied","Data":"bc137be67c9bd8fd7fe6440d4c8fcfc8de1dbd24914f14843bd5d60aa51b379d"} Mar 20 13:43:28 crc kubenswrapper[4849]: I0320 13:43:28.298969 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc137be67c9bd8fd7fe6440d4c8fcfc8de1dbd24914f14843bd5d60aa51b379d" Mar 20 13:43:29 crc kubenswrapper[4849]: I0320 13:43:29.307199 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"a0ca95b79cf9a961fd83bef1c27129a8025bdc5de561b30d5ca08991344250dc"} Mar 20 13:43:30 crc kubenswrapper[4849]: I0320 13:43:30.315186 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"9eaf507c0e2b3c9847ca8487ed65bd3b7a58819b537f03274d1b491e7c9c8769"} Mar 20 13:43:30 crc kubenswrapper[4849]: I0320 13:43:30.315465 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"deb4ab92c0e631b7c259dcf2b995c6b630482434802d4d99625b6d0f381bcfe1"} Mar 20 13:43:30 crc kubenswrapper[4849]: I0320 13:43:30.315475 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"c2905aba5dc1cf04149494b94bef3241f2016139d4a8a99da77ef4b7f710db51"} Mar 20 13:43:30 crc kubenswrapper[4849]: I0320 13:43:30.315483 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"ba0e6736b1b0df5c277c1294d1223ea01f91967f18750db78ba85142284fdb31"} Mar 20 13:43:32 crc kubenswrapper[4849]: I0320 13:43:32.334617 4849 generic.go:334] "Generic (PLEG): container finished" podID="4baaa4a5-7434-40f0-bfee-185b7fc4fafb" containerID="5cbd99f599b63b50294589943663a22165a805730ac589805dd1bcd01b8a9ce9" exitCode=0 Mar 20 13:43:32 crc kubenswrapper[4849]: I0320 13:43:32.334718 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-66s8p" event={"ID":"4baaa4a5-7434-40f0-bfee-185b7fc4fafb","Type":"ContainerDied","Data":"5cbd99f599b63b50294589943663a22165a805730ac589805dd1bcd01b8a9ce9"} Mar 20 13:43:32 crc kubenswrapper[4849]: I0320 13:43:32.338918 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"0b81ea336054e5d691e019a5abceb0102604b6e56b2f1e873a564823b9a958b1"} Mar 20 13:43:32 crc kubenswrapper[4849]: I0320 13:43:32.338961 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"5b2aaaadc16212cfc9a4752904754fbddb0b9fdf48ca667f8302388a3a27d24b"} Mar 20 13:43:32 crc kubenswrapper[4849]: I0320 13:43:32.338973 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"1ea18ffb285b860db41f16937df8ce259877d1eacd171c8a635b6b6bc452146b"} Mar 20 13:43:32 crc kubenswrapper[4849]: I0320 13:43:32.338984 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"d56e80da4000ca37b32f9c2e891680a9988cf0687060d7b5861a8c47fe7c7f3b"} Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.717101 4849 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.845066 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc9gq\" (UniqueName: \"kubernetes.io/projected/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-kube-api-access-mc9gq\") pod \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.845130 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-config-data\") pod \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.845169 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-db-sync-config-data\") pod \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.845208 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-combined-ca-bundle\") pod \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\" (UID: \"4baaa4a5-7434-40f0-bfee-185b7fc4fafb\") " Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.849539 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-kube-api-access-mc9gq" (OuterVolumeSpecName: "kube-api-access-mc9gq") pod "4baaa4a5-7434-40f0-bfee-185b7fc4fafb" (UID: "4baaa4a5-7434-40f0-bfee-185b7fc4fafb"). InnerVolumeSpecName "kube-api-access-mc9gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.849880 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4baaa4a5-7434-40f0-bfee-185b7fc4fafb" (UID: "4baaa4a5-7434-40f0-bfee-185b7fc4fafb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.875388 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4baaa4a5-7434-40f0-bfee-185b7fc4fafb" (UID: "4baaa4a5-7434-40f0-bfee-185b7fc4fafb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.901185 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-config-data" (OuterVolumeSpecName: "config-data") pod "4baaa4a5-7434-40f0-bfee-185b7fc4fafb" (UID: "4baaa4a5-7434-40f0-bfee-185b7fc4fafb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.947364 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc9gq\" (UniqueName: \"kubernetes.io/projected/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-kube-api-access-mc9gq\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.947410 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.947424 4849 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:33 crc kubenswrapper[4849]: I0320 13:43:33.947436 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4baaa4a5-7434-40f0-bfee-185b7fc4fafb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.367954 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"0710677bd0de4ba4f00b8955937056e281036d04312fa7c5956644787d71a46f"} Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.370315 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-66s8p" event={"ID":"4baaa4a5-7434-40f0-bfee-185b7fc4fafb","Type":"ContainerDied","Data":"cbbd5e87d17ba3419fa8dafbfb00c4ce632c9fead87ee372300140fed80462b1"} Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.370362 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbbd5e87d17ba3419fa8dafbfb00c4ce632c9fead87ee372300140fed80462b1" Mar 20 13:43:34 crc 
kubenswrapper[4849]: I0320 13:43:34.370407 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-66s8p" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.772857 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-l6m85"] Mar 20 13:43:34 crc kubenswrapper[4849]: E0320 13:43:34.773402 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef32d779-a195-46ae-9112-3ecdbfe73a1e" containerName="mariadb-account-create-update" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.773414 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef32d779-a195-46ae-9112-3ecdbfe73a1e" containerName="mariadb-account-create-update" Mar 20 13:43:34 crc kubenswrapper[4849]: E0320 13:43:34.773425 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4baaa4a5-7434-40f0-bfee-185b7fc4fafb" containerName="glance-db-sync" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.773431 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="4baaa4a5-7434-40f0-bfee-185b7fc4fafb" containerName="glance-db-sync" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.773589 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="4baaa4a5-7434-40f0-bfee-185b7fc4fafb" containerName="glance-db-sync" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.773600 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef32d779-a195-46ae-9112-3ecdbfe73a1e" containerName="mariadb-account-create-update" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.774411 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.802087 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-l6m85"] Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.863048 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.863094 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.863143 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr4wr\" (UniqueName: \"kubernetes.io/projected/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-kube-api-access-hr4wr\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.863178 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.863198 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-config\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.964336 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.964387 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.964445 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr4wr\" (UniqueName: \"kubernetes.io/projected/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-kube-api-access-hr4wr\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.964487 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.964510 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-config\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.965677 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-config\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.966261 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.966836 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.967354 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:34 crc kubenswrapper[4849]: I0320 13:43:34.990184 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr4wr\" (UniqueName: \"kubernetes.io/projected/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-kube-api-access-hr4wr\") pod 
\"dnsmasq-dns-5b946c75cc-l6m85\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:35 crc kubenswrapper[4849]: I0320 13:43:35.227310 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:35 crc kubenswrapper[4849]: I0320 13:43:35.532469 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"edeff7f7ccf5a978ff623c73c561cbb0264c4640c2c04dc2c2450c9f59c5085b"} Mar 20 13:43:35 crc kubenswrapper[4849]: I0320 13:43:35.532783 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"0d57f0d6b660553b1d77872d63348095ecb502b7297e8d4cdf9f1c4e2c11ef96"} Mar 20 13:43:35 crc kubenswrapper[4849]: I0320 13:43:35.532798 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"39d07e112a6dcedcb0cc4282c983f498aa16fb885aa8d4e29b13955bc6e57485"} Mar 20 13:43:35 crc kubenswrapper[4849]: I0320 13:43:35.532808 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"63bf2a25d60e51fc2e49f489c1dbf29e4e377796aea91b7673cfb322977a42c4"} Mar 20 13:43:35 crc kubenswrapper[4849]: I0320 13:43:35.532836 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"3c4b0e7ee2919c93164fef7ced87bb784b2468c1112b4a47194c32d106277e27"} Mar 20 13:43:36 crc kubenswrapper[4849]: I0320 13:43:36.286072 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-l6m85"] Mar 20 
13:43:36 crc kubenswrapper[4849]: W0320 13:43:36.288619 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8ec0c5_c17e_4c48_88ad_5dc7b6d626be.slice/crio-8aafa20f294ef2d9a49f02cbe0125154e92c3d5c56099019499fcd629fa7064d WatchSource:0}: Error finding container 8aafa20f294ef2d9a49f02cbe0125154e92c3d5c56099019499fcd629fa7064d: Status 404 returned error can't find the container with id 8aafa20f294ef2d9a49f02cbe0125154e92c3d5c56099019499fcd629fa7064d Mar 20 13:43:36 crc kubenswrapper[4849]: I0320 13:43:36.540346 4849 generic.go:334] "Generic (PLEG): container finished" podID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" containerID="ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41" exitCode=0 Mar 20 13:43:36 crc kubenswrapper[4849]: I0320 13:43:36.540408 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" event={"ID":"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be","Type":"ContainerDied","Data":"ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41"} Mar 20 13:43:36 crc kubenswrapper[4849]: I0320 13:43:36.540433 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" event={"ID":"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be","Type":"ContainerStarted","Data":"8aafa20f294ef2d9a49f02cbe0125154e92c3d5c56099019499fcd629fa7064d"} Mar 20 13:43:36 crc kubenswrapper[4849]: I0320 13:43:36.561034 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"189da4ab-90d9-4761-b94e-77f30a025385","Type":"ContainerStarted","Data":"ba8b5bacdeba407e4b73d90113f799584dfc0067a3154cc0539d83bf11235a63"} Mar 20 13:43:36 crc kubenswrapper[4849]: I0320 13:43:36.637873 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.397577439 podStartE2EDuration="42.63784571s" podCreationTimestamp="2026-03-20 13:42:54 
+0000 UTC" firstStartedPulling="2026-03-20 13:43:28.295667524 +0000 UTC m=+1157.973390919" lastFinishedPulling="2026-03-20 13:43:33.535935795 +0000 UTC m=+1163.213659190" observedRunningTime="2026-03-20 13:43:36.623367965 +0000 UTC m=+1166.301091380" watchObservedRunningTime="2026-03-20 13:43:36.63784571 +0000 UTC m=+1166.315569105" Mar 20 13:43:36 crc kubenswrapper[4849]: I0320 13:43:36.984765 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-l6m85"] Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.013419 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kh6fq"] Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.016205 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.018260 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.049622 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kh6fq"] Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.105665 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.105724 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-config\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc 
kubenswrapper[4849]: I0320 13:43:37.105794 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvc5b\" (UniqueName: \"kubernetes.io/projected/e17094e8-90c1-4262-abaf-99d824238711-kube-api-access-xvc5b\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.105839 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.105858 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.105882 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.206880 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.206948 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-config\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.207038 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvc5b\" (UniqueName: \"kubernetes.io/projected/e17094e8-90c1-4262-abaf-99d824238711-kube-api-access-xvc5b\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.207072 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.207089 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.207114 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 
crc kubenswrapper[4849]: I0320 13:43:37.207715 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.207754 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-config\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.208714 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.209082 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.209082 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.234813 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-xvc5b\" (UniqueName: \"kubernetes.io/projected/e17094e8-90c1-4262-abaf-99d824238711-kube-api-access-xvc5b\") pod \"dnsmasq-dns-74f6bcbc87-kh6fq\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.344683 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.604026 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" event={"ID":"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be","Type":"ContainerStarted","Data":"3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e"} Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.604354 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.627132 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" podStartSLOduration=3.6271135389999998 podStartE2EDuration="3.627113539s" podCreationTimestamp="2026-03-20 13:43:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:37.621969342 +0000 UTC m=+1167.299692767" watchObservedRunningTime="2026-03-20 13:43:37.627113539 +0000 UTC m=+1167.304836934" Mar 20 13:43:37 crc kubenswrapper[4849]: W0320 13:43:37.782029 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode17094e8_90c1_4262_abaf_99d824238711.slice/crio-0c193d9aa43e6a460a9f1318f61b386afd9288d86e03df73f82d1e7e95137bcb WatchSource:0}: Error finding container 0c193d9aa43e6a460a9f1318f61b386afd9288d86e03df73f82d1e7e95137bcb: Status 404 returned error can't find the container with id 
0c193d9aa43e6a460a9f1318f61b386afd9288d86e03df73f82d1e7e95137bcb Mar 20 13:43:37 crc kubenswrapper[4849]: I0320 13:43:37.783304 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kh6fq"] Mar 20 13:43:38 crc kubenswrapper[4849]: I0320 13:43:38.612172 4849 generic.go:334] "Generic (PLEG): container finished" podID="e17094e8-90c1-4262-abaf-99d824238711" containerID="72d41a411d14a7db0d177f21a3ddd0abc333316cd838a70a608f93d8dd790124" exitCode=0 Mar 20 13:43:38 crc kubenswrapper[4849]: I0320 13:43:38.612577 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" podUID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" containerName="dnsmasq-dns" containerID="cri-o://3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e" gracePeriod=10 Mar 20 13:43:38 crc kubenswrapper[4849]: I0320 13:43:38.613379 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" event={"ID":"e17094e8-90c1-4262-abaf-99d824238711","Type":"ContainerDied","Data":"72d41a411d14a7db0d177f21a3ddd0abc333316cd838a70a608f93d8dd790124"} Mar 20 13:43:38 crc kubenswrapper[4849]: I0320 13:43:38.613406 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" event={"ID":"e17094e8-90c1-4262-abaf-99d824238711","Type":"ContainerStarted","Data":"0c193d9aa43e6a460a9f1318f61b386afd9288d86e03df73f82d1e7e95137bcb"} Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.043842 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.143056 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-config\") pod \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.143125 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-nb\") pod \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.143154 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-sb\") pod \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.143282 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr4wr\" (UniqueName: \"kubernetes.io/projected/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-kube-api-access-hr4wr\") pod \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.143321 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-dns-svc\") pod \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\" (UID: \"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be\") " Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.156051 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-kube-api-access-hr4wr" (OuterVolumeSpecName: "kube-api-access-hr4wr") pod "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" (UID: "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be"). InnerVolumeSpecName "kube-api-access-hr4wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.171559 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="464306bd-0d8b-40ca-aa64-1ec5a00a527b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.187754 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" (UID: "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.188379 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" (UID: "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.201680 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-config" (OuterVolumeSpecName: "config") pod "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" (UID: "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.203552 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" (UID: "6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.216538 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="3c3c4952-4c22-4389-834c-969b89fb9e20" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.244905 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.244951 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.244967 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr4wr\" (UniqueName: \"kubernetes.io/projected/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-kube-api-access-hr4wr\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.244980 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.244988 4849 reconciler_common.go:293] "Volume detached 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.384619 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.384688 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.639172 4849 generic.go:334] "Generic (PLEG): container finished" podID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" containerID="3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e" exitCode=0 Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.639265 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.639286 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" event={"ID":"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be","Type":"ContainerDied","Data":"3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e"} Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.639706 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-l6m85" event={"ID":"6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be","Type":"ContainerDied","Data":"8aafa20f294ef2d9a49f02cbe0125154e92c3d5c56099019499fcd629fa7064d"} Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.639734 4849 scope.go:117] "RemoveContainer" containerID="3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.650864 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" event={"ID":"e17094e8-90c1-4262-abaf-99d824238711","Type":"ContainerStarted","Data":"5343facdadaca1fcaa74c67f5e2a10a1588517a0bb5bf386aefc4fc9c867833b"} Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.652768 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.678300 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" podStartSLOduration=3.678276661 podStartE2EDuration="3.678276661s" podCreationTimestamp="2026-03-20 13:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:39.668432731 +0000 UTC m=+1169.346156146" watchObservedRunningTime="2026-03-20 13:43:39.678276661 +0000 UTC m=+1169.356000056" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.688325 
4849 scope.go:117] "RemoveContainer" containerID="ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.703053 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-l6m85"] Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.707734 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-l6m85"] Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.778654 4849 scope.go:117] "RemoveContainer" containerID="3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e" Mar 20 13:43:39 crc kubenswrapper[4849]: E0320 13:43:39.779258 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e\": container with ID starting with 3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e not found: ID does not exist" containerID="3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.779308 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e"} err="failed to get container status \"3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e\": rpc error: code = NotFound desc = could not find container \"3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e\": container with ID starting with 3ee0ec8c092468b8145b6d650bbaafeac78c08b061eca0e6cfe0a5a4e241fc7e not found: ID does not exist" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.779336 4849 scope.go:117] "RemoveContainer" containerID="ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41" Mar 20 13:43:39 crc kubenswrapper[4849]: E0320 13:43:39.779948 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41\": container with ID starting with ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41 not found: ID does not exist" containerID="ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41" Mar 20 13:43:39 crc kubenswrapper[4849]: I0320 13:43:39.780020 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41"} err="failed to get container status \"ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41\": rpc error: code = NotFound desc = could not find container \"ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41\": container with ID starting with ef6c06f4cbfc5ae88e15524d548516b275dbd738817bc5ddb284026e3fcbad41 not found: ID does not exist" Mar 20 13:43:41 crc kubenswrapper[4849]: I0320 13:43:41.046490 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" path="/var/lib/kubelet/pods/6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be/volumes" Mar 20 13:43:41 crc kubenswrapper[4849]: I0320 13:43:41.098243 4849 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb0a03006-5384-4542-8b30-dc8bea37c96a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb0a03006-5384-4542-8b30-dc8bea37c96a] : Timed out while waiting for systemd to remove kubepods-besteffort-podb0a03006_5384_4542_8b30_dc8bea37c96a.slice" Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.346873 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.410486 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhdr6"] Mar 20 13:43:47 crc kubenswrapper[4849]: 
I0320 13:43:47.410787 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-qhdr6" podUID="959c8c00-cbce-41a1-8b5b-85d5885bda82" containerName="dnsmasq-dns" containerID="cri-o://6491f273c292616d8b86ff0b8c53113303e592f9a04760b6e490dca0bb172dd7" gracePeriod=10 Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.727402 4849 generic.go:334] "Generic (PLEG): container finished" podID="959c8c00-cbce-41a1-8b5b-85d5885bda82" containerID="6491f273c292616d8b86ff0b8c53113303e592f9a04760b6e490dca0bb172dd7" exitCode=0 Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.727761 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhdr6" event={"ID":"959c8c00-cbce-41a1-8b5b-85d5885bda82","Type":"ContainerDied","Data":"6491f273c292616d8b86ff0b8c53113303e592f9a04760b6e490dca0bb172dd7"} Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.884908 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.946729 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-nb\") pod \"959c8c00-cbce-41a1-8b5b-85d5885bda82\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.946810 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-dns-svc\") pod \"959c8c00-cbce-41a1-8b5b-85d5885bda82\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.995257 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "959c8c00-cbce-41a1-8b5b-85d5885bda82" (UID: "959c8c00-cbce-41a1-8b5b-85d5885bda82"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:47 crc kubenswrapper[4849]: I0320 13:43:47.999711 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "959c8c00-cbce-41a1-8b5b-85d5885bda82" (UID: "959c8c00-cbce-41a1-8b5b-85d5885bda82"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.048019 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-sb\") pod \"959c8c00-cbce-41a1-8b5b-85d5885bda82\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.048100 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp2sr\" (UniqueName: \"kubernetes.io/projected/959c8c00-cbce-41a1-8b5b-85d5885bda82-kube-api-access-rp2sr\") pod \"959c8c00-cbce-41a1-8b5b-85d5885bda82\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.048144 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-config\") pod \"959c8c00-cbce-41a1-8b5b-85d5885bda82\" (UID: \"959c8c00-cbce-41a1-8b5b-85d5885bda82\") " Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.048558 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.048579 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.050939 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959c8c00-cbce-41a1-8b5b-85d5885bda82-kube-api-access-rp2sr" (OuterVolumeSpecName: "kube-api-access-rp2sr") pod "959c8c00-cbce-41a1-8b5b-85d5885bda82" (UID: "959c8c00-cbce-41a1-8b5b-85d5885bda82"). 
InnerVolumeSpecName "kube-api-access-rp2sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.080707 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-config" (OuterVolumeSpecName: "config") pod "959c8c00-cbce-41a1-8b5b-85d5885bda82" (UID: "959c8c00-cbce-41a1-8b5b-85d5885bda82"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.082564 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "959c8c00-cbce-41a1-8b5b-85d5885bda82" (UID: "959c8c00-cbce-41a1-8b5b-85d5885bda82"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.150023 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.150069 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp2sr\" (UniqueName: \"kubernetes.io/projected/959c8c00-cbce-41a1-8b5b-85d5885bda82-kube-api-access-rp2sr\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.150084 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/959c8c00-cbce-41a1-8b5b-85d5885bda82-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.739157 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-qhdr6" 
event={"ID":"959c8c00-cbce-41a1-8b5b-85d5885bda82","Type":"ContainerDied","Data":"e78d914b40c9d4b6bae88daa9dc8fff2a1b517107b09183856b86c6b919c9c1c"} Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.739219 4849 scope.go:117] "RemoveContainer" containerID="6491f273c292616d8b86ff0b8c53113303e592f9a04760b6e490dca0bb172dd7" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.739354 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-qhdr6" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.762851 4849 scope.go:117] "RemoveContainer" containerID="311477fbb4048d73b18530d45a84270d4c482e1d4f9ffd34ef9d17fdc4323edf" Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.768752 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhdr6"] Mar 20 13:43:48 crc kubenswrapper[4849]: I0320 13:43:48.774890 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-qhdr6"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.045483 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959c8c00-cbce-41a1-8b5b-85d5885bda82" path="/var/lib/kubelet/pods/959c8c00-cbce-41a1-8b5b-85d5885bda82/volumes" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.169981 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.215455 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.520690 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gqdxs"] Mar 20 13:43:49 crc kubenswrapper[4849]: E0320 13:43:49.521117 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c8c00-cbce-41a1-8b5b-85d5885bda82" containerName="dnsmasq-dns" Mar 20 13:43:49 crc 
kubenswrapper[4849]: I0320 13:43:49.521144 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c8c00-cbce-41a1-8b5b-85d5885bda82" containerName="dnsmasq-dns" Mar 20 13:43:49 crc kubenswrapper[4849]: E0320 13:43:49.521162 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c8c00-cbce-41a1-8b5b-85d5885bda82" containerName="init" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.521170 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c8c00-cbce-41a1-8b5b-85d5885bda82" containerName="init" Mar 20 13:43:49 crc kubenswrapper[4849]: E0320 13:43:49.521187 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" containerName="init" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.521228 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" containerName="init" Mar 20 13:43:49 crc kubenswrapper[4849]: E0320 13:43:49.521259 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" containerName="dnsmasq-dns" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.521268 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" containerName="dnsmasq-dns" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.521459 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8ec0c5-c17e-4c48-88ad-5dc7b6d626be" containerName="dnsmasq-dns" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.521489 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c8c00-cbce-41a1-8b5b-85d5885bda82" containerName="dnsmasq-dns" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.522133 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.538198 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gqdxs"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.628238 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2589-account-create-update-q8rmz"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.629165 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.631202 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.644427 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2589-account-create-update-q8rmz"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.679908 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nq5b\" (UniqueName: \"kubernetes.io/projected/5b5fb05e-2f40-432f-acf5-068f32e62698-kube-api-access-2nq5b\") pod \"cinder-db-create-gqdxs\" (UID: \"5b5fb05e-2f40-432f-acf5-068f32e62698\") " pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.679994 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5fb05e-2f40-432f-acf5-068f32e62698-operator-scripts\") pod \"cinder-db-create-gqdxs\" (UID: \"5b5fb05e-2f40-432f-acf5-068f32e62698\") " pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.717080 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bxj2z"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.718108 4849 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.734514 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bxj2z"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.781891 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-operator-scripts\") pod \"cinder-2589-account-create-update-q8rmz\" (UID: \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\") " pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.781960 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nq5b\" (UniqueName: \"kubernetes.io/projected/5b5fb05e-2f40-432f-acf5-068f32e62698-kube-api-access-2nq5b\") pod \"cinder-db-create-gqdxs\" (UID: \"5b5fb05e-2f40-432f-acf5-068f32e62698\") " pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.782031 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5fb05e-2f40-432f-acf5-068f32e62698-operator-scripts\") pod \"cinder-db-create-gqdxs\" (UID: \"5b5fb05e-2f40-432f-acf5-068f32e62698\") " pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.782082 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7kh5\" (UniqueName: \"kubernetes.io/projected/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-kube-api-access-w7kh5\") pod \"cinder-2589-account-create-update-q8rmz\" (UID: \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\") " pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.783106 4849 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5fb05e-2f40-432f-acf5-068f32e62698-operator-scripts\") pod \"cinder-db-create-gqdxs\" (UID: \"5b5fb05e-2f40-432f-acf5-068f32e62698\") " pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.798545 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nq5b\" (UniqueName: \"kubernetes.io/projected/5b5fb05e-2f40-432f-acf5-068f32e62698-kube-api-access-2nq5b\") pod \"cinder-db-create-gqdxs\" (UID: \"5b5fb05e-2f40-432f-acf5-068f32e62698\") " pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.832295 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-jdcns"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.833322 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.839550 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1a7e-account-create-update-lqd64"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.840686 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.842067 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.846167 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1a7e-account-create-update-lqd64"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.849181 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.854261 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jdcns"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.893873 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f67cr\" (UniqueName: \"kubernetes.io/projected/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-kube-api-access-f67cr\") pod \"barbican-db-create-bxj2z\" (UID: \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\") " pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.894021 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7kh5\" (UniqueName: \"kubernetes.io/projected/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-kube-api-access-w7kh5\") pod \"cinder-2589-account-create-update-q8rmz\" (UID: \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\") " pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.894053 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-operator-scripts\") pod \"barbican-db-create-bxj2z\" (UID: \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\") " pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.894190 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-operator-scripts\") pod \"cinder-2589-account-create-update-q8rmz\" (UID: \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\") " pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.895296 4849 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-operator-scripts\") pod \"cinder-2589-account-create-update-q8rmz\" (UID: \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\") " pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.928976 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xvqnp"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.930109 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.933485 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7kh5\" (UniqueName: \"kubernetes.io/projected/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-kube-api-access-w7kh5\") pod \"cinder-2589-account-create-update-q8rmz\" (UID: \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\") " pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.935172 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.935779 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.936224 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5zkb" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.936449 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.943581 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xvqnp"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.946220 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.954001 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e6b6-account-create-update-bxjkb"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.955065 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.983423 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e6b6-account-create-update-bxjkb"] Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.997698 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-operator-scripts\") pod \"neutron-1a7e-account-create-update-lqd64\" (UID: \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\") " pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.997795 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f67cr\" (UniqueName: \"kubernetes.io/projected/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-kube-api-access-f67cr\") pod \"barbican-db-create-bxj2z\" (UID: \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\") " pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.997841 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmdj4\" (UniqueName: \"kubernetes.io/projected/3d1422a9-d5f8-4349-8513-0bd372fa8500-kube-api-access-hmdj4\") pod \"neutron-db-create-jdcns\" (UID: \"3d1422a9-d5f8-4349-8513-0bd372fa8500\") " pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.997954 4849 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-operator-scripts\") pod \"barbican-db-create-bxj2z\" (UID: \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\") " pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.997986 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1422a9-d5f8-4349-8513-0bd372fa8500-operator-scripts\") pod \"neutron-db-create-jdcns\" (UID: \"3d1422a9-d5f8-4349-8513-0bd372fa8500\") " pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.998089 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwwlj\" (UniqueName: \"kubernetes.io/projected/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-kube-api-access-rwwlj\") pod \"neutron-1a7e-account-create-update-lqd64\" (UID: \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\") " pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:49 crc kubenswrapper[4849]: I0320 13:43:49.998919 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-operator-scripts\") pod \"barbican-db-create-bxj2z\" (UID: \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\") " pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.004309 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.020866 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f67cr\" (UniqueName: \"kubernetes.io/projected/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-kube-api-access-f67cr\") pod \"barbican-db-create-bxj2z\" (UID: \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\") " 
pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.031617 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.099389 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-config-data\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.099442 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmdj4\" (UniqueName: \"kubernetes.io/projected/3d1422a9-d5f8-4349-8513-0bd372fa8500-kube-api-access-hmdj4\") pod \"neutron-db-create-jdcns\" (UID: \"3d1422a9-d5f8-4349-8513-0bd372fa8500\") " pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.099595 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tqh6\" (UniqueName: \"kubernetes.io/projected/05bc515e-52ae-4e93-b967-a458d135ae12-kube-api-access-7tqh6\") pod \"barbican-e6b6-account-create-update-bxjkb\" (UID: \"05bc515e-52ae-4e93-b967-a458d135ae12\") " pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.099713 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt57g\" (UniqueName: \"kubernetes.io/projected/6129a249-c10a-4299-90c6-147c58b4926e-kube-api-access-bt57g\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.099747 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1422a9-d5f8-4349-8513-0bd372fa8500-operator-scripts\") pod \"neutron-db-create-jdcns\" (UID: \"3d1422a9-d5f8-4349-8513-0bd372fa8500\") " pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.099884 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwlj\" (UniqueName: \"kubernetes.io/projected/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-kube-api-access-rwwlj\") pod \"neutron-1a7e-account-create-update-lqd64\" (UID: \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\") " pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.100208 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-combined-ca-bundle\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.100246 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc515e-52ae-4e93-b967-a458d135ae12-operator-scripts\") pod \"barbican-e6b6-account-create-update-bxjkb\" (UID: \"05bc515e-52ae-4e93-b967-a458d135ae12\") " pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.100326 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-operator-scripts\") pod \"neutron-1a7e-account-create-update-lqd64\" (UID: \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\") " pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.100410 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1422a9-d5f8-4349-8513-0bd372fa8500-operator-scripts\") pod \"neutron-db-create-jdcns\" (UID: \"3d1422a9-d5f8-4349-8513-0bd372fa8500\") " pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.100954 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-operator-scripts\") pod \"neutron-1a7e-account-create-update-lqd64\" (UID: \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\") " pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.114099 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmdj4\" (UniqueName: \"kubernetes.io/projected/3d1422a9-d5f8-4349-8513-0bd372fa8500-kube-api-access-hmdj4\") pod \"neutron-db-create-jdcns\" (UID: \"3d1422a9-d5f8-4349-8513-0bd372fa8500\") " pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.126861 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwwlj\" (UniqueName: \"kubernetes.io/projected/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-kube-api-access-rwwlj\") pod \"neutron-1a7e-account-create-update-lqd64\" (UID: \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\") " pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.208732 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-combined-ca-bundle\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.208800 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc515e-52ae-4e93-b967-a458d135ae12-operator-scripts\") pod \"barbican-e6b6-account-create-update-bxjkb\" (UID: \"05bc515e-52ae-4e93-b967-a458d135ae12\") " pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.208877 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-config-data\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.208911 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tqh6\" (UniqueName: \"kubernetes.io/projected/05bc515e-52ae-4e93-b967-a458d135ae12-kube-api-access-7tqh6\") pod \"barbican-e6b6-account-create-update-bxjkb\" (UID: \"05bc515e-52ae-4e93-b967-a458d135ae12\") " pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.208948 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt57g\" (UniqueName: \"kubernetes.io/projected/6129a249-c10a-4299-90c6-147c58b4926e-kube-api-access-bt57g\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.212753 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-combined-ca-bundle\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.213251 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc515e-52ae-4e93-b967-a458d135ae12-operator-scripts\") pod \"barbican-e6b6-account-create-update-bxjkb\" (UID: \"05bc515e-52ae-4e93-b967-a458d135ae12\") " pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.222608 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-config-data\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.237438 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt57g\" (UniqueName: \"kubernetes.io/projected/6129a249-c10a-4299-90c6-147c58b4926e-kube-api-access-bt57g\") pod \"keystone-db-sync-xvqnp\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.245365 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tqh6\" (UniqueName: \"kubernetes.io/projected/05bc515e-52ae-4e93-b967-a458d135ae12-kube-api-access-7tqh6\") pod \"barbican-e6b6-account-create-update-bxjkb\" (UID: \"05bc515e-52ae-4e93-b967-a458d135ae12\") " pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.284209 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.305146 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.329229 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.332753 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.409812 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bxj2z"] Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.423610 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gqdxs"] Mar 20 13:43:50 crc kubenswrapper[4849]: W0320 13:43:50.446788 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b5fb05e_2f40_432f_acf5_068f32e62698.slice/crio-a6940e604d0cd58f751187bbd913d94c011edefa1e5a5c87a44509e29937e15c WatchSource:0}: Error finding container a6940e604d0cd58f751187bbd913d94c011edefa1e5a5c87a44509e29937e15c: Status 404 returned error can't find the container with id a6940e604d0cd58f751187bbd913d94c011edefa1e5a5c87a44509e29937e15c Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.498289 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2589-account-create-update-q8rmz"] Mar 20 13:43:50 crc kubenswrapper[4849]: W0320 13:43:50.530773 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6155d2c7_33f5_4bbb_b6a0_a378848a08e5.slice/crio-866cb5e958eab4e3118de97aafbe463b9a4480c9a422b68e72d7fe50e09c5f70 WatchSource:0}: Error finding container 866cb5e958eab4e3118de97aafbe463b9a4480c9a422b68e72d7fe50e09c5f70: Status 404 returned error can't find the container with id 866cb5e958eab4e3118de97aafbe463b9a4480c9a422b68e72d7fe50e09c5f70 Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.758941 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gqdxs" 
event={"ID":"5b5fb05e-2f40-432f-acf5-068f32e62698","Type":"ContainerStarted","Data":"b724a256287ef2cf0c43f3ff39747afd41fe83ec47e27847686868b9aebdfe10"} Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.758984 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gqdxs" event={"ID":"5b5fb05e-2f40-432f-acf5-068f32e62698","Type":"ContainerStarted","Data":"a6940e604d0cd58f751187bbd913d94c011edefa1e5a5c87a44509e29937e15c"} Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.765747 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2589-account-create-update-q8rmz" event={"ID":"6155d2c7-33f5-4bbb-b6a0-a378848a08e5","Type":"ContainerStarted","Data":"7accee13eab298a882b99f170b6f179d6166c223602c4f9609540aad21013052"} Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.765796 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2589-account-create-update-q8rmz" event={"ID":"6155d2c7-33f5-4bbb-b6a0-a378848a08e5","Type":"ContainerStarted","Data":"866cb5e958eab4e3118de97aafbe463b9a4480c9a422b68e72d7fe50e09c5f70"} Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.772180 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bxj2z" event={"ID":"66983f57-dfbe-4c47-90f6-9eef82ebd9a1","Type":"ContainerStarted","Data":"dd1b63c949713675b1f02a553c25e128ba4e4f58d0e7c7479148e78cc5bf860b"} Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.772235 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bxj2z" event={"ID":"66983f57-dfbe-4c47-90f6-9eef82ebd9a1","Type":"ContainerStarted","Data":"fc593cf1c95fad903c8c277715afaff1d7ff42e45c6084b4052cd40e85f8deb1"} Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.804149 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2589-account-create-update-q8rmz" podStartSLOduration=1.8041348990000001 
podStartE2EDuration="1.804134899s" podCreationTimestamp="2026-03-20 13:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:50.803759209 +0000 UTC m=+1180.481482604" watchObservedRunningTime="2026-03-20 13:43:50.804134899 +0000 UTC m=+1180.481858294" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.805655 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-gqdxs" podStartSLOduration=1.805646729 podStartE2EDuration="1.805646729s" podCreationTimestamp="2026-03-20 13:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:50.784319394 +0000 UTC m=+1180.462042829" watchObservedRunningTime="2026-03-20 13:43:50.805646729 +0000 UTC m=+1180.483370124" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.865637 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-bxj2z" podStartSLOduration=1.865613747 podStartE2EDuration="1.865613747s" podCreationTimestamp="2026-03-20 13:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:50.822215178 +0000 UTC m=+1180.499938563" watchObservedRunningTime="2026-03-20 13:43:50.865613747 +0000 UTC m=+1180.543337142" Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.868983 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1a7e-account-create-update-lqd64"] Mar 20 13:43:50 crc kubenswrapper[4849]: W0320 13:43:50.875246 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01152ef5_8fb0_44bd_aa3d_6a1e8a4e2f1c.slice/crio-f312bb330b661a1e6a0b68d8fdf23085ebd07f53f00f388b4c8898f86f1ad606 WatchSource:0}: Error 
finding container f312bb330b661a1e6a0b68d8fdf23085ebd07f53f00f388b4c8898f86f1ad606: Status 404 returned error can't find the container with id f312bb330b661a1e6a0b68d8fdf23085ebd07f53f00f388b4c8898f86f1ad606 Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.878358 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-jdcns"] Mar 20 13:43:50 crc kubenswrapper[4849]: I0320 13:43:50.885886 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xvqnp"] Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.060004 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e6b6-account-create-update-bxjkb"] Mar 20 13:43:51 crc kubenswrapper[4849]: W0320 13:43:51.155351 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05bc515e_52ae_4e93_b967_a458d135ae12.slice/crio-47ada1a95ef5c99721fba35df03928c1f658af72c97981250c998055e01d475d WatchSource:0}: Error finding container 47ada1a95ef5c99721fba35df03928c1f658af72c97981250c998055e01d475d: Status 404 returned error can't find the container with id 47ada1a95ef5c99721fba35df03928c1f658af72c97981250c998055e01d475d Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.791386 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jdcns" event={"ID":"3d1422a9-d5f8-4349-8513-0bd372fa8500","Type":"ContainerStarted","Data":"c406957d2efda55ad10cbef7a42d1d202eeb9461998ac4439beb54986ace02f6"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.791905 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jdcns" event={"ID":"3d1422a9-d5f8-4349-8513-0bd372fa8500","Type":"ContainerStarted","Data":"b061a031532cf2c2ececaf7ef394d55189e305a911b40bd09cff06c7d19dad6a"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.795042 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-1a7e-account-create-update-lqd64" event={"ID":"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c","Type":"ContainerStarted","Data":"2aceb57fe05e593b5cca1609dc37957848d7dfe43e49ce228d6c9593c262551c"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.795078 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a7e-account-create-update-lqd64" event={"ID":"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c","Type":"ContainerStarted","Data":"f312bb330b661a1e6a0b68d8fdf23085ebd07f53f00f388b4c8898f86f1ad606"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.798053 4849 generic.go:334] "Generic (PLEG): container finished" podID="6155d2c7-33f5-4bbb-b6a0-a378848a08e5" containerID="7accee13eab298a882b99f170b6f179d6166c223602c4f9609540aad21013052" exitCode=0 Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.798115 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2589-account-create-update-q8rmz" event={"ID":"6155d2c7-33f5-4bbb-b6a0-a378848a08e5","Type":"ContainerDied","Data":"7accee13eab298a882b99f170b6f179d6166c223602c4f9609540aad21013052"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.799493 4849 generic.go:334] "Generic (PLEG): container finished" podID="66983f57-dfbe-4c47-90f6-9eef82ebd9a1" containerID="dd1b63c949713675b1f02a553c25e128ba4e4f58d0e7c7479148e78cc5bf860b" exitCode=0 Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.799558 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bxj2z" event={"ID":"66983f57-dfbe-4c47-90f6-9eef82ebd9a1","Type":"ContainerDied","Data":"dd1b63c949713675b1f02a553c25e128ba4e4f58d0e7c7479148e78cc5bf860b"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.800976 4849 generic.go:334] "Generic (PLEG): container finished" podID="5b5fb05e-2f40-432f-acf5-068f32e62698" containerID="b724a256287ef2cf0c43f3ff39747afd41fe83ec47e27847686868b9aebdfe10" exitCode=0 Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.801034 
4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gqdxs" event={"ID":"5b5fb05e-2f40-432f-acf5-068f32e62698","Type":"ContainerDied","Data":"b724a256287ef2cf0c43f3ff39747afd41fe83ec47e27847686868b9aebdfe10"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.803831 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e6b6-account-create-update-bxjkb" event={"ID":"05bc515e-52ae-4e93-b967-a458d135ae12","Type":"ContainerStarted","Data":"d3013a21b55879ce171f0042ee5d7803ef2c391a92af0feebb0c4c7ca517b993"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.803864 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e6b6-account-create-update-bxjkb" event={"ID":"05bc515e-52ae-4e93-b967-a458d135ae12","Type":"ContainerStarted","Data":"47ada1a95ef5c99721fba35df03928c1f658af72c97981250c998055e01d475d"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.804709 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xvqnp" event={"ID":"6129a249-c10a-4299-90c6-147c58b4926e","Type":"ContainerStarted","Data":"2ce4fa480cc178fa895833e742a8060e18d4e685982c577356e4e4581d5371cb"} Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.842478 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-jdcns" podStartSLOduration=2.8424427679999997 podStartE2EDuration="2.842442768s" podCreationTimestamp="2026-03-20 13:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:51.811160619 +0000 UTC m=+1181.488884034" watchObservedRunningTime="2026-03-20 13:43:51.842442768 +0000 UTC m=+1181.520166163" Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.843254 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-1a7e-account-create-update-lqd64" 
podStartSLOduration=2.843243549 podStartE2EDuration="2.843243549s" podCreationTimestamp="2026-03-20 13:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:51.833163492 +0000 UTC m=+1181.510886887" watchObservedRunningTime="2026-03-20 13:43:51.843243549 +0000 UTC m=+1181.520966944" Mar 20 13:43:51 crc kubenswrapper[4849]: I0320 13:43:51.913380 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-e6b6-account-create-update-bxjkb" podStartSLOduration=2.913355286 podStartE2EDuration="2.913355286s" podCreationTimestamp="2026-03-20 13:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:51.909196496 +0000 UTC m=+1181.586919891" watchObservedRunningTime="2026-03-20 13:43:51.913355286 +0000 UTC m=+1181.591078681" Mar 20 13:43:52 crc kubenswrapper[4849]: I0320 13:43:52.814422 4849 generic.go:334] "Generic (PLEG): container finished" podID="3d1422a9-d5f8-4349-8513-0bd372fa8500" containerID="c406957d2efda55ad10cbef7a42d1d202eeb9461998ac4439beb54986ace02f6" exitCode=0 Mar 20 13:43:52 crc kubenswrapper[4849]: I0320 13:43:52.814524 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jdcns" event={"ID":"3d1422a9-d5f8-4349-8513-0bd372fa8500","Type":"ContainerDied","Data":"c406957d2efda55ad10cbef7a42d1d202eeb9461998ac4439beb54986ace02f6"} Mar 20 13:43:52 crc kubenswrapper[4849]: I0320 13:43:52.819390 4849 generic.go:334] "Generic (PLEG): container finished" podID="01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c" containerID="2aceb57fe05e593b5cca1609dc37957848d7dfe43e49ce228d6c9593c262551c" exitCode=0 Mar 20 13:43:52 crc kubenswrapper[4849]: I0320 13:43:52.819453 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a7e-account-create-update-lqd64" 
event={"ID":"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c","Type":"ContainerDied","Data":"2aceb57fe05e593b5cca1609dc37957848d7dfe43e49ce228d6c9593c262551c"} Mar 20 13:43:52 crc kubenswrapper[4849]: I0320 13:43:52.821153 4849 generic.go:334] "Generic (PLEG): container finished" podID="05bc515e-52ae-4e93-b967-a458d135ae12" containerID="d3013a21b55879ce171f0042ee5d7803ef2c391a92af0feebb0c4c7ca517b993" exitCode=0 Mar 20 13:43:52 crc kubenswrapper[4849]: I0320 13:43:52.821204 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e6b6-account-create-update-bxjkb" event={"ID":"05bc515e-52ae-4e93-b967-a458d135ae12","Type":"ContainerDied","Data":"d3013a21b55879ce171f0042ee5d7803ef2c391a92af0feebb0c4c7ca517b993"} Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.344974 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.352991 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.365935 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.476768 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-operator-scripts\") pod \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\" (UID: \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\") " Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.476978 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5fb05e-2f40-432f-acf5-068f32e62698-operator-scripts\") pod \"5b5fb05e-2f40-432f-acf5-068f32e62698\" (UID: \"5b5fb05e-2f40-432f-acf5-068f32e62698\") " Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.477051 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nq5b\" (UniqueName: \"kubernetes.io/projected/5b5fb05e-2f40-432f-acf5-068f32e62698-kube-api-access-2nq5b\") pod \"5b5fb05e-2f40-432f-acf5-068f32e62698\" (UID: \"5b5fb05e-2f40-432f-acf5-068f32e62698\") " Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.477097 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f67cr\" (UniqueName: \"kubernetes.io/projected/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-kube-api-access-f67cr\") pod \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\" (UID: \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\") " Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.477173 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7kh5\" (UniqueName: \"kubernetes.io/projected/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-kube-api-access-w7kh5\") pod \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\" (UID: \"6155d2c7-33f5-4bbb-b6a0-a378848a08e5\") " Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.477231 4849 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-operator-scripts\") pod \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\" (UID: \"66983f57-dfbe-4c47-90f6-9eef82ebd9a1\") " Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.477307 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6155d2c7-33f5-4bbb-b6a0-a378848a08e5" (UID: "6155d2c7-33f5-4bbb-b6a0-a378848a08e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.477585 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.478171 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66983f57-dfbe-4c47-90f6-9eef82ebd9a1" (UID: "66983f57-dfbe-4c47-90f6-9eef82ebd9a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.478660 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b5fb05e-2f40-432f-acf5-068f32e62698-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b5fb05e-2f40-432f-acf5-068f32e62698" (UID: "5b5fb05e-2f40-432f-acf5-068f32e62698"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.483390 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5fb05e-2f40-432f-acf5-068f32e62698-kube-api-access-2nq5b" (OuterVolumeSpecName: "kube-api-access-2nq5b") pod "5b5fb05e-2f40-432f-acf5-068f32e62698" (UID: "5b5fb05e-2f40-432f-acf5-068f32e62698"). InnerVolumeSpecName "kube-api-access-2nq5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.489361 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-kube-api-access-f67cr" (OuterVolumeSpecName: "kube-api-access-f67cr") pod "66983f57-dfbe-4c47-90f6-9eef82ebd9a1" (UID: "66983f57-dfbe-4c47-90f6-9eef82ebd9a1"). InnerVolumeSpecName "kube-api-access-f67cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.493591 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-kube-api-access-w7kh5" (OuterVolumeSpecName: "kube-api-access-w7kh5") pod "6155d2c7-33f5-4bbb-b6a0-a378848a08e5" (UID: "6155d2c7-33f5-4bbb-b6a0-a378848a08e5"). InnerVolumeSpecName "kube-api-access-w7kh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.579406 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b5fb05e-2f40-432f-acf5-068f32e62698-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.579438 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nq5b\" (UniqueName: \"kubernetes.io/projected/5b5fb05e-2f40-432f-acf5-068f32e62698-kube-api-access-2nq5b\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.579450 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f67cr\" (UniqueName: \"kubernetes.io/projected/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-kube-api-access-f67cr\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.579459 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7kh5\" (UniqueName: \"kubernetes.io/projected/6155d2c7-33f5-4bbb-b6a0-a378848a08e5-kube-api-access-w7kh5\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.579468 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66983f57-dfbe-4c47-90f6-9eef82ebd9a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.848737 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2589-account-create-update-q8rmz" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.848828 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2589-account-create-update-q8rmz" event={"ID":"6155d2c7-33f5-4bbb-b6a0-a378848a08e5","Type":"ContainerDied","Data":"866cb5e958eab4e3118de97aafbe463b9a4480c9a422b68e72d7fe50e09c5f70"} Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.848867 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="866cb5e958eab4e3118de97aafbe463b9a4480c9a422b68e72d7fe50e09c5f70" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.850990 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bxj2z" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.851152 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bxj2z" event={"ID":"66983f57-dfbe-4c47-90f6-9eef82ebd9a1","Type":"ContainerDied","Data":"fc593cf1c95fad903c8c277715afaff1d7ff42e45c6084b4052cd40e85f8deb1"} Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.851188 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc593cf1c95fad903c8c277715afaff1d7ff42e45c6084b4052cd40e85f8deb1" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.854812 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gqdxs" event={"ID":"5b5fb05e-2f40-432f-acf5-068f32e62698","Type":"ContainerDied","Data":"a6940e604d0cd58f751187bbd913d94c011edefa1e5a5c87a44509e29937e15c"} Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.854900 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6940e604d0cd58f751187bbd913d94c011edefa1e5a5c87a44509e29937e15c" Mar 20 13:43:53 crc kubenswrapper[4849]: I0320 13:43:53.855031 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gqdxs" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.404351 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.407863 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.429828 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.525635 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmdj4\" (UniqueName: \"kubernetes.io/projected/3d1422a9-d5f8-4349-8513-0bd372fa8500-kube-api-access-hmdj4\") pod \"3d1422a9-d5f8-4349-8513-0bd372fa8500\" (UID: \"3d1422a9-d5f8-4349-8513-0bd372fa8500\") " Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.526021 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwwlj\" (UniqueName: \"kubernetes.io/projected/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-kube-api-access-rwwlj\") pod \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\" (UID: \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\") " Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.526062 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-operator-scripts\") pod \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\" (UID: \"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c\") " Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.526124 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3d1422a9-d5f8-4349-8513-0bd372fa8500-operator-scripts\") pod \"3d1422a9-d5f8-4349-8513-0bd372fa8500\" (UID: \"3d1422a9-d5f8-4349-8513-0bd372fa8500\") " Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.526400 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc515e-52ae-4e93-b967-a458d135ae12-operator-scripts\") pod \"05bc515e-52ae-4e93-b967-a458d135ae12\" (UID: \"05bc515e-52ae-4e93-b967-a458d135ae12\") " Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.526427 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tqh6\" (UniqueName: \"kubernetes.io/projected/05bc515e-52ae-4e93-b967-a458d135ae12-kube-api-access-7tqh6\") pod \"05bc515e-52ae-4e93-b967-a458d135ae12\" (UID: \"05bc515e-52ae-4e93-b967-a458d135ae12\") " Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.526839 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c" (UID: "01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.527080 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05bc515e-52ae-4e93-b967-a458d135ae12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05bc515e-52ae-4e93-b967-a458d135ae12" (UID: "05bc515e-52ae-4e93-b967-a458d135ae12"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.527770 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d1422a9-d5f8-4349-8513-0bd372fa8500-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d1422a9-d5f8-4349-8513-0bd372fa8500" (UID: "3d1422a9-d5f8-4349-8513-0bd372fa8500"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.530758 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05bc515e-52ae-4e93-b967-a458d135ae12-kube-api-access-7tqh6" (OuterVolumeSpecName: "kube-api-access-7tqh6") pod "05bc515e-52ae-4e93-b967-a458d135ae12" (UID: "05bc515e-52ae-4e93-b967-a458d135ae12"). InnerVolumeSpecName "kube-api-access-7tqh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.536754 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d1422a9-d5f8-4349-8513-0bd372fa8500-kube-api-access-hmdj4" (OuterVolumeSpecName: "kube-api-access-hmdj4") pod "3d1422a9-d5f8-4349-8513-0bd372fa8500" (UID: "3d1422a9-d5f8-4349-8513-0bd372fa8500"). InnerVolumeSpecName "kube-api-access-hmdj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.538106 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-kube-api-access-rwwlj" (OuterVolumeSpecName: "kube-api-access-rwwlj") pod "01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c" (UID: "01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c"). InnerVolumeSpecName "kube-api-access-rwwlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.628698 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmdj4\" (UniqueName: \"kubernetes.io/projected/3d1422a9-d5f8-4349-8513-0bd372fa8500-kube-api-access-hmdj4\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.628978 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwwlj\" (UniqueName: \"kubernetes.io/projected/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-kube-api-access-rwwlj\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.629076 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.629150 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d1422a9-d5f8-4349-8513-0bd372fa8500-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.629206 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bc515e-52ae-4e93-b967-a458d135ae12-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.629277 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tqh6\" (UniqueName: \"kubernetes.io/projected/05bc515e-52ae-4e93-b967-a458d135ae12-kube-api-access-7tqh6\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.877079 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e6b6-account-create-update-bxjkb" 
event={"ID":"05bc515e-52ae-4e93-b967-a458d135ae12","Type":"ContainerDied","Data":"47ada1a95ef5c99721fba35df03928c1f658af72c97981250c998055e01d475d"} Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.877120 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47ada1a95ef5c99721fba35df03928c1f658af72c97981250c998055e01d475d" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.877231 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e6b6-account-create-update-bxjkb" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.879405 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xvqnp" event={"ID":"6129a249-c10a-4299-90c6-147c58b4926e","Type":"ContainerStarted","Data":"49d2c53933c500e513c072ee38587bcca5460421ba255a31fb2a0a36d5b59b82"} Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.881239 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-jdcns" event={"ID":"3d1422a9-d5f8-4349-8513-0bd372fa8500","Type":"ContainerDied","Data":"b061a031532cf2c2ececaf7ef394d55189e305a911b40bd09cff06c7d19dad6a"} Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.881370 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b061a031532cf2c2ececaf7ef394d55189e305a911b40bd09cff06c7d19dad6a" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.881272 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-jdcns" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.882735 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1a7e-account-create-update-lqd64" event={"ID":"01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c","Type":"ContainerDied","Data":"f312bb330b661a1e6a0b68d8fdf23085ebd07f53f00f388b4c8898f86f1ad606"} Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.882776 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f312bb330b661a1e6a0b68d8fdf23085ebd07f53f00f388b4c8898f86f1ad606" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.882833 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1a7e-account-create-update-lqd64" Mar 20 13:43:56 crc kubenswrapper[4849]: I0320 13:43:56.907190 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xvqnp" podStartSLOduration=2.514255125 podStartE2EDuration="7.907171642s" podCreationTimestamp="2026-03-20 13:43:49 +0000 UTC" firstStartedPulling="2026-03-20 13:43:50.901713083 +0000 UTC m=+1180.579436478" lastFinishedPulling="2026-03-20 13:43:56.2946296 +0000 UTC m=+1185.972352995" observedRunningTime="2026-03-20 13:43:56.904026519 +0000 UTC m=+1186.581749924" watchObservedRunningTime="2026-03-20 13:43:56.907171642 +0000 UTC m=+1186.584895037" Mar 20 13:43:59 crc kubenswrapper[4849]: I0320 13:43:59.912309 4849 generic.go:334] "Generic (PLEG): container finished" podID="6129a249-c10a-4299-90c6-147c58b4926e" containerID="49d2c53933c500e513c072ee38587bcca5460421ba255a31fb2a0a36d5b59b82" exitCode=0 Mar 20 13:43:59 crc kubenswrapper[4849]: I0320 13:43:59.912456 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xvqnp" event={"ID":"6129a249-c10a-4299-90c6-147c58b4926e","Type":"ContainerDied","Data":"49d2c53933c500e513c072ee38587bcca5460421ba255a31fb2a0a36d5b59b82"} Mar 20 
13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.138442 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566904-gd8zb"] Mar 20 13:44:00 crc kubenswrapper[4849]: E0320 13:44:00.139194 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139221 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: E0320 13:44:00.139240 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5fb05e-2f40-432f-acf5-068f32e62698" containerName="mariadb-database-create" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139250 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5fb05e-2f40-432f-acf5-068f32e62698" containerName="mariadb-database-create" Mar 20 13:44:00 crc kubenswrapper[4849]: E0320 13:44:00.139264 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05bc515e-52ae-4e93-b967-a458d135ae12" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139272 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="05bc515e-52ae-4e93-b967-a458d135ae12" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: E0320 13:44:00.139287 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66983f57-dfbe-4c47-90f6-9eef82ebd9a1" containerName="mariadb-database-create" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139295 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="66983f57-dfbe-4c47-90f6-9eef82ebd9a1" containerName="mariadb-database-create" Mar 20 13:44:00 crc kubenswrapper[4849]: E0320 13:44:00.139315 4849 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6155d2c7-33f5-4bbb-b6a0-a378848a08e5" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139324 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155d2c7-33f5-4bbb-b6a0-a378848a08e5" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: E0320 13:44:00.139341 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d1422a9-d5f8-4349-8513-0bd372fa8500" containerName="mariadb-database-create" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139349 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d1422a9-d5f8-4349-8513-0bd372fa8500" containerName="mariadb-database-create" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139546 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d1422a9-d5f8-4349-8513-0bd372fa8500" containerName="mariadb-database-create" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139563 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155d2c7-33f5-4bbb-b6a0-a378848a08e5" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139570 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="05bc515e-52ae-4e93-b967-a458d135ae12" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139577 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5fb05e-2f40-432f-acf5-068f32e62698" containerName="mariadb-database-create" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139586 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c" containerName="mariadb-account-create-update" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.139602 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="66983f57-dfbe-4c47-90f6-9eef82ebd9a1" containerName="mariadb-database-create" Mar 
20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.140170 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-gd8zb" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.143609 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.143666 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.145942 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-gd8zb"] Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.146020 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.283727 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8xlq\" (UniqueName: \"kubernetes.io/projected/3bec97e6-a0d7-484e-ba32-9469c22ff871-kube-api-access-j8xlq\") pod \"auto-csr-approver-29566904-gd8zb\" (UID: \"3bec97e6-a0d7-484e-ba32-9469c22ff871\") " pod="openshift-infra/auto-csr-approver-29566904-gd8zb" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.385457 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8xlq\" (UniqueName: \"kubernetes.io/projected/3bec97e6-a0d7-484e-ba32-9469c22ff871-kube-api-access-j8xlq\") pod \"auto-csr-approver-29566904-gd8zb\" (UID: \"3bec97e6-a0d7-484e-ba32-9469c22ff871\") " pod="openshift-infra/auto-csr-approver-29566904-gd8zb" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.405278 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8xlq\" (UniqueName: 
\"kubernetes.io/projected/3bec97e6-a0d7-484e-ba32-9469c22ff871-kube-api-access-j8xlq\") pod \"auto-csr-approver-29566904-gd8zb\" (UID: \"3bec97e6-a0d7-484e-ba32-9469c22ff871\") " pod="openshift-infra/auto-csr-approver-29566904-gd8zb" Mar 20 13:44:00 crc kubenswrapper[4849]: I0320 13:44:00.459417 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-gd8zb" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.027741 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-gd8zb"] Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.195146 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.233667 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt57g\" (UniqueName: \"kubernetes.io/projected/6129a249-c10a-4299-90c6-147c58b4926e-kube-api-access-bt57g\") pod \"6129a249-c10a-4299-90c6-147c58b4926e\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.233878 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-config-data\") pod \"6129a249-c10a-4299-90c6-147c58b4926e\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.235232 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-combined-ca-bundle\") pod \"6129a249-c10a-4299-90c6-147c58b4926e\" (UID: \"6129a249-c10a-4299-90c6-147c58b4926e\") " Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.239781 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6129a249-c10a-4299-90c6-147c58b4926e-kube-api-access-bt57g" (OuterVolumeSpecName: "kube-api-access-bt57g") pod "6129a249-c10a-4299-90c6-147c58b4926e" (UID: "6129a249-c10a-4299-90c6-147c58b4926e"). InnerVolumeSpecName "kube-api-access-bt57g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.259539 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6129a249-c10a-4299-90c6-147c58b4926e" (UID: "6129a249-c10a-4299-90c6-147c58b4926e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.282507 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-config-data" (OuterVolumeSpecName: "config-data") pod "6129a249-c10a-4299-90c6-147c58b4926e" (UID: "6129a249-c10a-4299-90c6-147c58b4926e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.336662 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt57g\" (UniqueName: \"kubernetes.io/projected/6129a249-c10a-4299-90c6-147c58b4926e-kube-api-access-bt57g\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.336690 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.336700 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6129a249-c10a-4299-90c6-147c58b4926e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.928636 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-gd8zb" event={"ID":"3bec97e6-a0d7-484e-ba32-9469c22ff871","Type":"ContainerStarted","Data":"d8136be18f3939e1eb064110cbb1552e264a81673baff7c8de444102067b1ff5"} Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.929899 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xvqnp" event={"ID":"6129a249-c10a-4299-90c6-147c58b4926e","Type":"ContainerDied","Data":"2ce4fa480cc178fa895833e742a8060e18d4e685982c577356e4e4581d5371cb"} Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.929924 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce4fa480cc178fa895833e742a8060e18d4e685982c577356e4e4581d5371cb" Mar 20 13:44:01 crc kubenswrapper[4849]: I0320 13:44:01.929970 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xvqnp" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.180394 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-h2xf9"] Mar 20 13:44:02 crc kubenswrapper[4849]: E0320 13:44:02.180949 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6129a249-c10a-4299-90c6-147c58b4926e" containerName="keystone-db-sync" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.180962 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6129a249-c10a-4299-90c6-147c58b4926e" containerName="keystone-db-sync" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.181117 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="6129a249-c10a-4299-90c6-147c58b4926e" containerName="keystone-db-sync" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.181861 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.213729 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-h2xf9"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.254757 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-svc\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.255298 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc 
kubenswrapper[4849]: I0320 13:44:02.255403 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.255496 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-config\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.255518 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl2qk\" (UniqueName: \"kubernetes.io/projected/d338730e-90ce-44bc-9b2d-b03b3c596cf8-kube-api-access-fl2qk\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.255616 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.266105 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gz947"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.267564 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.282535 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.282863 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.283132 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5zkb" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.283304 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.288138 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.316178 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gz947"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360238 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-fernet-keys\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360280 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360334 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-config\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360355 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl2qk\" (UniqueName: \"kubernetes.io/projected/d338730e-90ce-44bc-9b2d-b03b3c596cf8-kube-api-access-fl2qk\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360389 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-credential-keys\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360407 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-config-data\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360427 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66xnv\" (UniqueName: \"kubernetes.io/projected/b2b9d8d1-1879-4871-85d9-2192274358ac-kube-api-access-66xnv\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360445 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360466 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-svc\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360501 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360529 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-scripts\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.360547 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-combined-ca-bundle\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.361716 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.362219 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-config\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.363456 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.364013 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-svc\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.368947 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.376159 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-79d4788db5-tz9b5"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.377548 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.385784 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.385968 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.386080 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.386176 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-4n8ws" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.402332 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79d4788db5-tz9b5"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.438717 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl2qk\" (UniqueName: \"kubernetes.io/projected/d338730e-90ce-44bc-9b2d-b03b3c596cf8-kube-api-access-fl2qk\") pod \"dnsmasq-dns-847c4cc679-h2xf9\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.461806 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-credential-keys\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.461957 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-config-data\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " 
pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.461987 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66xnv\" (UniqueName: \"kubernetes.io/projected/b2b9d8d1-1879-4871-85d9-2192274358ac-kube-api-access-66xnv\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.462023 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-scripts\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.462057 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-logs\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.462109 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-scripts\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.462136 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-combined-ca-bundle\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 
13:44:02.462171 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-fernet-keys\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.462208 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-config-data\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.462237 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v2hj\" (UniqueName: \"kubernetes.io/projected/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-kube-api-access-9v2hj\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.462288 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-horizon-secret-key\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.466044 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-credential-keys\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.467391 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-combined-ca-bundle\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.470874 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-fernet-keys\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.472368 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-config-data\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.475355 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-scripts\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.515441 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.528279 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66xnv\" (UniqueName: \"kubernetes.io/projected/b2b9d8d1-1879-4871-85d9-2192274358ac-kube-api-access-66xnv\") pod \"keystone-bootstrap-gz947\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.528390 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-wh76b"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.529582 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.541372 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.541642 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8qbgb" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.541772 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.563515 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2hj\" (UniqueName: \"kubernetes.io/projected/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-kube-api-access-9v2hj\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.563588 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-horizon-secret-key\") pod \"horizon-79d4788db5-tz9b5\" (UID: 
\"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.563637 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-scripts\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.563663 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-logs\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.563732 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-config-data\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.564935 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-config-data\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.567079 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-logs\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.567082 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-scripts\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.601843 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-horizon-secret-key\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.613480 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wh76b"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.621440 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v2hj\" (UniqueName: \"kubernetes.io/projected/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-kube-api-access-9v2hj\") pod \"horizon-79d4788db5-tz9b5\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.637498 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.651883 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gwq28"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.653016 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.662048 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.662708 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-44cqd" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.670393 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-combined-ca-bundle\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.670458 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnf66\" (UniqueName: \"kubernetes.io/projected/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-kube-api-access-jnf66\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.670530 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-config\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.676561 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jk575"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.677554 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.682598 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-plg4l" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.682740 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.699365 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jk575"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.699966 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.706370 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.710831 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.714756 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.718249 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.718474 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.722912 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gwq28"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.735435 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74457cc7cf-gpm8s"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.737250 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.771686 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xxcp\" (UniqueName: \"kubernetes.io/projected/ee9399c2-4755-4acd-8514-7d49cdd92f16-kube-api-access-7xxcp\") pod \"barbican-db-sync-gwq28\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.771741 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-combined-ca-bundle\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.771769 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-combined-ca-bundle\") pod \"barbican-db-sync-gwq28\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.771793 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnf66\" (UniqueName: \"kubernetes.io/projected/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-kube-api-access-jnf66\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.771849 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-db-sync-config-data\") pod \"barbican-db-sync-gwq28\" (UID: 
\"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.771908 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-config\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.799501 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-config\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.800097 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-combined-ca-bundle\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.818155 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-xbwzw"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.819354 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.829097 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.829540 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.829721 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rdl7q" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.832556 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnf66\" (UniqueName: \"kubernetes.io/projected/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-kube-api-access-jnf66\") pod \"neutron-db-sync-wh76b\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.856159 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74457cc7cf-gpm8s"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.873220 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-combined-ca-bundle\") pod \"barbican-db-sync-gwq28\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.873429 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-scripts\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.873510 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-config-data\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.873589 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-db-sync-config-data\") pod \"barbican-db-sync-gwq28\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.873942 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-horizon-secret-key\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874032 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-scripts\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874096 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-config-data\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874168 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-config-data\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874234 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-scripts\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874312 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchsg\" (UniqueName: \"kubernetes.io/projected/701dfbaa-ecac-4290-9402-90c866ccd108-kube-api-access-wchsg\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874379 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-run-httpd\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874439 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-combined-ca-bundle\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874515 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/701dfbaa-ecac-4290-9402-90c866ccd108-etc-machine-id\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874622 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-db-sync-config-data\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874702 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874811 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtsdp\" (UniqueName: \"kubernetes.io/projected/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-kube-api-access-gtsdp\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.874981 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fthp2\" (UniqueName: \"kubernetes.io/projected/5e321362-ff76-4c26-bbde-9a97617ca460-kube-api-access-fthp2\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.875082 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-logs\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.875157 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-log-httpd\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.875276 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xxcp\" (UniqueName: \"kubernetes.io/projected/ee9399c2-4755-4acd-8514-7d49cdd92f16-kube-api-access-7xxcp\") pod \"barbican-db-sync-gwq28\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.875339 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.882889 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-combined-ca-bundle\") pod \"barbican-db-sync-gwq28\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.895118 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.896643 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-db-sync-config-data\") pod \"barbican-db-sync-gwq28\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.927665 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xxcp\" (UniqueName: \"kubernetes.io/projected/ee9399c2-4755-4acd-8514-7d49cdd92f16-kube-api-access-7xxcp\") pod \"barbican-db-sync-gwq28\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.965754 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xbwzw"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977379 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-horizon-secret-key\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977421 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-scripts\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977441 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-config-data\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977466 4849 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-config-data\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977480 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-scripts\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977509 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wchsg\" (UniqueName: \"kubernetes.io/projected/701dfbaa-ecac-4290-9402-90c866ccd108-kube-api-access-wchsg\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977524 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-run-httpd\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977540 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-combined-ca-bundle\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977557 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/701dfbaa-ecac-4290-9402-90c866ccd108-etc-machine-id\") pod 
\"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977579 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-db-sync-config-data\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977599 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977615 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtsdp\" (UniqueName: \"kubernetes.io/projected/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-kube-api-access-gtsdp\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977631 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fthp2\" (UniqueName: \"kubernetes.io/projected/5e321362-ff76-4c26-bbde-9a97617ca460-kube-api-access-fthp2\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977652 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-scripts\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" 
Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977667 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-logs\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977685 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-log-httpd\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977704 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977724 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39790e43-e227-4e13-8054-995e12255ec8-logs\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977770 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-combined-ca-bundle\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977787 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7f7qs\" (UniqueName: \"kubernetes.io/projected/39790e43-e227-4e13-8054-995e12255ec8-kube-api-access-7f7qs\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977802 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-config-data\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977840 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-scripts\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.977874 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-config-data\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.981769 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-scripts\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.989380 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-combined-ca-bundle\") pod 
\"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.989706 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-run-httpd\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.990108 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/701dfbaa-ecac-4290-9402-90c866ccd108-etc-machine-id\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.991697 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-config-data\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.991911 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-logs\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.991982 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-log-httpd\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.992010 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-847c4cc679-h2xf9"] Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.993307 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-config-data\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.996792 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.997621 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-config-data\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.998185 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-db-sync-config-data\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.998615 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-scripts\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:02 crc kubenswrapper[4849]: I0320 13:44:02.999272 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-scripts\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.003455 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-horizon-secret-key\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.007076 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-gd8zb" event={"ID":"3bec97e6-a0d7-484e-ba32-9469c22ff871","Type":"ContainerStarted","Data":"881216ee4bf96cb670a317b4944ff7febf607a77c4cbc839160bcb2d730ff126"} Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.013961 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lftjx"] Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.015295 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.015736 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.021201 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchsg\" (UniqueName: \"kubernetes.io/projected/701dfbaa-ecac-4290-9402-90c866ccd108-kube-api-access-wchsg\") pod \"cinder-db-sync-jk575\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.021682 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fthp2\" (UniqueName: \"kubernetes.io/projected/5e321362-ff76-4c26-bbde-9a97617ca460-kube-api-access-fthp2\") pod \"ceilometer-0\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " pod="openstack/ceilometer-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.031262 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lftjx"] Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.047937 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.054443 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.069351 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtsdp\" (UniqueName: \"kubernetes.io/projected/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-kube-api-access-gtsdp\") pod \"horizon-74457cc7cf-gpm8s\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.076156 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.133992 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.179707 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.169363 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.169617 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.216542 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566904-gd8zb" podStartSLOduration=2.077086263 podStartE2EDuration="3.21651472s" podCreationTimestamp="2026-03-20 13:44:00 +0000 UTC" firstStartedPulling="2026-03-20 13:44:01.045647056 +0000 UTC m=+1190.723370451" lastFinishedPulling="2026-03-20 13:44:02.185075513 +0000 UTC m=+1191.862798908" observedRunningTime="2026-03-20 13:44:03.070205945 +0000 UTC m=+1192.747929360" watchObservedRunningTime="2026-03-20 13:44:03.21651472 +0000 UTC m=+1192.894238125" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.139629 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-scripts\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.217070 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39790e43-e227-4e13-8054-995e12255ec8-logs\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.217103 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-combined-ca-bundle\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.217128 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f7qs\" (UniqueName: 
\"kubernetes.io/projected/39790e43-e227-4e13-8054-995e12255ec8-kube-api-access-7f7qs\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.217149 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-config-data\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.187125 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.222495 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.187088 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-scripts\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.224737 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-z4p48" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.225121 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.226308 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.230186 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/39790e43-e227-4e13-8054-995e12255ec8-logs\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.235699 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-combined-ca-bundle\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.249256 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-config-data\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.266706 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f7qs\" (UniqueName: \"kubernetes.io/projected/39790e43-e227-4e13-8054-995e12255ec8-kube-api-access-7f7qs\") pod \"placement-db-sync-xbwzw\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") " pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.323701 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.323780 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.323803 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-config\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.323870 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.324000 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.324036 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttq4h\" (UniqueName: \"kubernetes.io/projected/deeae318-4f97-4be7-90dc-c63f22dcf3a6-kube-api-access-ttq4h\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.402901 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-h2xf9"] Mar 20 
13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.434122 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.434201 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.434227 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.434321 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.434359 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc 
kubenswrapper[4849]: I0320 13:44:03.434973 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttq4h\" (UniqueName: \"kubernetes.io/projected/deeae318-4f97-4be7-90dc-c63f22dcf3a6-kube-api-access-ttq4h\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435133 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht5hs\" (UniqueName: \"kubernetes.io/projected/a9b12460-0ac3-446b-a794-6dd376201333-kube-api-access-ht5hs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435251 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435307 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435334 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435432 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435480 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-config\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435536 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435551 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.435624 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc 
kubenswrapper[4849]: I0320 13:44:03.436602 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.437112 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.439433 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.440056 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-config\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.455219 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gz947"] Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.472971 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-xbwzw" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.478591 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttq4h\" (UniqueName: \"kubernetes.io/projected/deeae318-4f97-4be7-90dc-c63f22dcf3a6-kube-api-access-ttq4h\") pod \"dnsmasq-dns-785d8bcb8c-lftjx\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.524063 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.526440 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.541948 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.542199 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.542212 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.542272 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.542342 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.542382 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.542417 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.542439 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.543972 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.544038 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ht5hs\" (UniqueName: \"kubernetes.io/projected/a9b12460-0ac3-446b-a794-6dd376201333-kube-api-access-ht5hs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.549513 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.551539 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.552675 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.552907 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.553164 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.564749 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.575004 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.587005 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.593850 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht5hs\" (UniqueName: \"kubernetes.io/projected/a9b12460-0ac3-446b-a794-6dd376201333-kube-api-access-ht5hs\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.629000 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.646229 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-logs\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.646556 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-scripts\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.646590 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.646614 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.646639 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.646693 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.646864 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmc7d\" (UniqueName: \"kubernetes.io/projected/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-kube-api-access-tmc7d\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.646882 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-config-data\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.674778 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-79d4788db5-tz9b5"] Mar 20 13:44:03 crc kubenswrapper[4849]: W0320 13:44:03.738056 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef4d07e7_fc99_4d1c_b424_01dd7a58edd7.slice/crio-99786e49ab601b6814a3f65c923db713f4f2d094ca56a6e90168316dd17d6bcd WatchSource:0}: Error finding container 99786e49ab601b6814a3f65c923db713f4f2d094ca56a6e90168316dd17d6bcd: Status 404 returned error can't find the container with id 99786e49ab601b6814a3f65c923db713f4f2d094ca56a6e90168316dd17d6bcd Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.748166 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.748196 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmc7d\" (UniqueName: \"kubernetes.io/projected/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-kube-api-access-tmc7d\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.748216 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-config-data\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.748280 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-logs\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.748312 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-scripts\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.748343 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.748367 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.748388 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.749562 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.749628 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-logs\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.750797 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.759438 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.760342 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-config-data\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.767889 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.767996 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.777531 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-scripts\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.784319 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmc7d\" (UniqueName: 
\"kubernetes.io/projected/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-kube-api-access-tmc7d\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.812592 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:03 crc kubenswrapper[4849]: I0320 13:44:03.855263 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.017489 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gz947" event={"ID":"b2b9d8d1-1879-4871-85d9-2192274358ac","Type":"ContainerStarted","Data":"a19c96162d1f1d6aa4932ff050024bf930cd0e515ebb07895568780b8b23af2c"} Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.017543 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gz947" event={"ID":"b2b9d8d1-1879-4871-85d9-2192274358ac","Type":"ContainerStarted","Data":"66d5e6a8800575a9e8e35f23ba0eef5530bce9fffaef26bba4e56f1160033d45"} Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.023865 4849 generic.go:334] "Generic (PLEG): container finished" podID="3bec97e6-a0d7-484e-ba32-9469c22ff871" containerID="881216ee4bf96cb670a317b4944ff7febf607a77c4cbc839160bcb2d730ff126" exitCode=0 Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.023925 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-gd8zb" event={"ID":"3bec97e6-a0d7-484e-ba32-9469c22ff871","Type":"ContainerDied","Data":"881216ee4bf96cb670a317b4944ff7febf607a77c4cbc839160bcb2d730ff126"} Mar 20 13:44:04 crc 
kubenswrapper[4849]: I0320 13:44:04.027918 4849 generic.go:334] "Generic (PLEG): container finished" podID="d338730e-90ce-44bc-9b2d-b03b3c596cf8" containerID="6748d6221e2040afe7f27853bcfd1b881637aa8aef12e22341c2ffc391c4088a" exitCode=0 Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.028037 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" event={"ID":"d338730e-90ce-44bc-9b2d-b03b3c596cf8","Type":"ContainerDied","Data":"6748d6221e2040afe7f27853bcfd1b881637aa8aef12e22341c2ffc391c4088a"} Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.028074 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" event={"ID":"d338730e-90ce-44bc-9b2d-b03b3c596cf8","Type":"ContainerStarted","Data":"e959ebadc1db9a6351fea0378c2c51d4301f0a240b1912f19ec579ac84af49c0"} Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.031109 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d4788db5-tz9b5" event={"ID":"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7","Type":"ContainerStarted","Data":"99786e49ab601b6814a3f65c923db713f4f2d094ca56a6e90168316dd17d6bcd"} Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.067546 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gz947" podStartSLOduration=2.0675289980000002 podStartE2EDuration="2.067528998s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:04.047132117 +0000 UTC m=+1193.724855532" watchObservedRunningTime="2026-03-20 13:44:04.067528998 +0000 UTC m=+1193.745252383" Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.088965 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-wh76b"] Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.097209 4849 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/barbican-db-sync-gwq28"] Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.097441 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.129421 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:04 crc kubenswrapper[4849]: W0320 13:44:04.130658 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a5cf24_7a8d_40f9_87cc_0b9b6533e520.slice/crio-08a57355a02cf30937326c6d254846fd581ca4c68e7f05672142639968c2ecaf WatchSource:0}: Error finding container 08a57355a02cf30937326c6d254846fd581ca4c68e7f05672142639968c2ecaf: Status 404 returned error can't find the container with id 08a57355a02cf30937326c6d254846fd581ca4c68e7f05672142639968c2ecaf Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.142451 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jk575"] Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.289078 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74457cc7cf-gpm8s"] Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.451566 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-xbwzw"] Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.474900 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lftjx"] Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.580045 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.826774 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.923050 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.979375 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-config\") pod \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.979460 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-nb\") pod \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.979485 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-svc\") pod \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.979572 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-sb\") pod \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.979606 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-swift-storage-0\") pod \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " Mar 20 
13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.979669 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl2qk\" (UniqueName: \"kubernetes.io/projected/d338730e-90ce-44bc-9b2d-b03b3c596cf8-kube-api-access-fl2qk\") pod \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\" (UID: \"d338730e-90ce-44bc-9b2d-b03b3c596cf8\") " Mar 20 13:44:04 crc kubenswrapper[4849]: I0320 13:44:04.982963 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.011929 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74457cc7cf-gpm8s"] Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.017238 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.017536 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d338730e-90ce-44bc-9b2d-b03b3c596cf8-kube-api-access-fl2qk" (OuterVolumeSpecName: "kube-api-access-fl2qk") pod "d338730e-90ce-44bc-9b2d-b03b3c596cf8" (UID: "d338730e-90ce-44bc-9b2d-b03b3c596cf8"). InnerVolumeSpecName "kube-api-access-fl2qk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.037915 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76b4654f4f-2sft4"] Mar 20 13:44:05 crc kubenswrapper[4849]: E0320 13:44:05.038525 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d338730e-90ce-44bc-9b2d-b03b3c596cf8" containerName="init" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.038543 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="d338730e-90ce-44bc-9b2d-b03b3c596cf8" containerName="init" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.038700 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="d338730e-90ce-44bc-9b2d-b03b3c596cf8" containerName="init" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.041423 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76b4654f4f-2sft4" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.081527 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d338730e-90ce-44bc-9b2d-b03b3c596cf8" (UID: "d338730e-90ce-44bc-9b2d-b03b3c596cf8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.099774 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c58468-f82b-4d72-9a56-a9ebf39db54e-logs\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.099848 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-scripts\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.104611 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9c58468-f82b-4d72-9a56-a9ebf39db54e-horizon-secret-key\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.104656 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzpnw\" (UniqueName: \"kubernetes.io/projected/d9c58468-f82b-4d72-9a56-a9ebf39db54e-kube-api-access-mzpnw\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.104867 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-config-data\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " 
pod="openstack/horizon-76b4654f4f-2sft4" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.104938 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl2qk\" (UniqueName: \"kubernetes.io/projected/d338730e-90ce-44bc-9b2d-b03b3c596cf8-kube-api-access-fl2qk\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.104952 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.124648 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-config" (OuterVolumeSpecName: "config") pod "d338730e-90ce-44bc-9b2d-b03b3c596cf8" (UID: "d338730e-90ce-44bc-9b2d-b03b3c596cf8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.124812 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d338730e-90ce-44bc-9b2d-b03b3c596cf8" (UID: "d338730e-90ce-44bc-9b2d-b03b3c596cf8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.132376 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d338730e-90ce-44bc-9b2d-b03b3c596cf8" (UID: "d338730e-90ce-44bc-9b2d-b03b3c596cf8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.148417 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d338730e-90ce-44bc-9b2d-b03b3c596cf8" (UID: "d338730e-90ce-44bc-9b2d-b03b3c596cf8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.155113 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76b4654f4f-2sft4"] Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.155147 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.163008 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e321362-ff76-4c26-bbde-9a97617ca460","Type":"ContainerStarted","Data":"9654243ab408b918f14113295b644a6adcb4933523b20481b3368be5fd4aadea"} Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.164866 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwq28" event={"ID":"ee9399c2-4755-4acd-8514-7d49cdd92f16","Type":"ContainerStarted","Data":"75cc55c6f8b591eb4808ebe6dd3d5b4d3f3cf28a40b876aa3e18f863ab38e4b8"} Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.166354 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce","Type":"ContainerStarted","Data":"d3ddab40047a5a30291f96ae300e3f1976b6a1086baeff2596bebc16a3c3c206"} Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.167464 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jk575" 
event={"ID":"701dfbaa-ecac-4290-9402-90c866ccd108","Type":"ContainerStarted","Data":"71fb2390e3c750900edccd8105a8978199d55cf1005899cec2eb35f52717c3cb"}
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.168517 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9b12460-0ac3-446b-a794-6dd376201333","Type":"ContainerStarted","Data":"ca866ab1781b2a2609951a1ed02d98d20112885a1e28888a28f765cbae8c8be4"}
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.170201 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xbwzw" event={"ID":"39790e43-e227-4e13-8054-995e12255ec8","Type":"ContainerStarted","Data":"d9a9d571166c36fcb656e222c6eae2245c293836dbc168c8f5aa8a2f8dafaa40"}
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.173320 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" event={"ID":"deeae318-4f97-4be7-90dc-c63f22dcf3a6","Type":"ContainerStarted","Data":"2322908ae870a1181a7a176015c1e34cf2b59f39ecc865b7618029843a2d2d73"}
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.186428 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74457cc7cf-gpm8s" event={"ID":"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7","Type":"ContainerStarted","Data":"14f819c2bf87604def8b025a2a19a1c6482ca77f88cf1e7226fa5a16fd0add68"}
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.207977 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-h2xf9" event={"ID":"d338730e-90ce-44bc-9b2d-b03b3c596cf8","Type":"ContainerDied","Data":"e959ebadc1db9a6351fea0378c2c51d4301f0a240b1912f19ec579ac84af49c0"}
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.208031 4849 scope.go:117] "RemoveContainer" containerID="6748d6221e2040afe7f27853bcfd1b881637aa8aef12e22341c2ffc391c4088a"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.208171 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-h2xf9"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.209932 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-scripts\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.209993 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9c58468-f82b-4d72-9a56-a9ebf39db54e-horizon-secret-key\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.210022 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzpnw\" (UniqueName: \"kubernetes.io/projected/d9c58468-f82b-4d72-9a56-a9ebf39db54e-kube-api-access-mzpnw\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.210098 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-config-data\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.210121 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c58468-f82b-4d72-9a56-a9ebf39db54e-logs\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.210168 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.210178 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.210187 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.210196 4849 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d338730e-90ce-44bc-9b2d-b03b3c596cf8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.210511 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c58468-f82b-4d72-9a56-a9ebf39db54e-logs\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.211141 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-scripts\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.212154 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-config-data\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.218328 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9c58468-f82b-4d72-9a56-a9ebf39db54e-horizon-secret-key\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.247922 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzpnw\" (UniqueName: \"kubernetes.io/projected/d9c58468-f82b-4d72-9a56-a9ebf39db54e-kube-api-access-mzpnw\") pod \"horizon-76b4654f4f-2sft4\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.248220 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wh76b" event={"ID":"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520","Type":"ContainerStarted","Data":"7f3cb2db57e1ff97b4333f399e9f9b2263113c1cda171a7408e89a733fef5fd3"}
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.248259 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wh76b" event={"ID":"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520","Type":"ContainerStarted","Data":"08a57355a02cf30937326c6d254846fd581ca4c68e7f05672142639968c2ecaf"}
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.309280 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-wh76b" podStartSLOduration=3.309256944 podStartE2EDuration="3.309256944s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:05.279220768 +0000 UTC m=+1194.956944153" watchObservedRunningTime="2026-03-20 13:44:05.309256944 +0000 UTC m=+1194.986980349"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.365586 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-h2xf9"]
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.373507 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76b4654f4f-2sft4"
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.374149 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-h2xf9"]
Mar 20 13:44:05 crc kubenswrapper[4849]: I0320 13:44:05.909258 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-gd8zb"
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.039927 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8xlq\" (UniqueName: \"kubernetes.io/projected/3bec97e6-a0d7-484e-ba32-9469c22ff871-kube-api-access-j8xlq\") pod \"3bec97e6-a0d7-484e-ba32-9469c22ff871\" (UID: \"3bec97e6-a0d7-484e-ba32-9469c22ff871\") "
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.045471 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bec97e6-a0d7-484e-ba32-9469c22ff871-kube-api-access-j8xlq" (OuterVolumeSpecName: "kube-api-access-j8xlq") pod "3bec97e6-a0d7-484e-ba32-9469c22ff871" (UID: "3bec97e6-a0d7-484e-ba32-9469c22ff871"). InnerVolumeSpecName "kube-api-access-j8xlq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.058945 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76b4654f4f-2sft4"]
Mar 20 13:44:06 crc kubenswrapper[4849]: W0320 13:44:06.061965 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9c58468_f82b_4d72_9a56_a9ebf39db54e.slice/crio-83811919dc87d8193c9fde9ce088b8ede425fe6b2afeca3404cbb27641a3aeae WatchSource:0}: Error finding container 83811919dc87d8193c9fde9ce088b8ede425fe6b2afeca3404cbb27641a3aeae: Status 404 returned error can't find the container with id 83811919dc87d8193c9fde9ce088b8ede425fe6b2afeca3404cbb27641a3aeae
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.142044 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8xlq\" (UniqueName: \"kubernetes.io/projected/3bec97e6-a0d7-484e-ba32-9469c22ff871-kube-api-access-j8xlq\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.259023 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9b12460-0ac3-446b-a794-6dd376201333","Type":"ContainerStarted","Data":"37b0ebf2d515f95379d1e03ed812cd0a4fd7e9e08ecddb531591e365c8451a31"}
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.262016 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-gd8zb" event={"ID":"3bec97e6-a0d7-484e-ba32-9469c22ff871","Type":"ContainerDied","Data":"d8136be18f3939e1eb064110cbb1552e264a81673baff7c8de444102067b1ff5"}
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.262041 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8136be18f3939e1eb064110cbb1552e264a81673baff7c8de444102067b1ff5"
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.262135 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-gd8zb"
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.284552 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76b4654f4f-2sft4" event={"ID":"d9c58468-f82b-4d72-9a56-a9ebf39db54e","Type":"ContainerStarted","Data":"83811919dc87d8193c9fde9ce088b8ede425fe6b2afeca3404cbb27641a3aeae"}
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.312635 4849 generic.go:334] "Generic (PLEG): container finished" podID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerID="de1ae04c0e9fd1ae4a4c5aa8908f7eb713f9ec7870dc01b9b4df79d4b841a788" exitCode=0
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.313698 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" event={"ID":"deeae318-4f97-4be7-90dc-c63f22dcf3a6","Type":"ContainerDied","Data":"de1ae04c0e9fd1ae4a4c5aa8908f7eb713f9ec7870dc01b9b4df79d4b841a788"}
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.992205 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-2zntj"]
Mar 20 13:44:06 crc kubenswrapper[4849]: I0320 13:44:06.999510 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-2zntj"]
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.049211 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c655a1-070d-4965-9949-6b3080d99104" path="/var/lib/kubelet/pods/72c655a1-070d-4965-9949-6b3080d99104/volumes"
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.049924 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d338730e-90ce-44bc-9b2d-b03b3c596cf8" path="/var/lib/kubelet/pods/d338730e-90ce-44bc-9b2d-b03b3c596cf8/volumes"
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.322122 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" event={"ID":"deeae318-4f97-4be7-90dc-c63f22dcf3a6","Type":"ContainerStarted","Data":"e9a47e91b926a8c035fd933101b4bef4dcde35f1ecbdfe2fa6c339495ec73e34"}
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.323071 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx"
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.327364 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9b12460-0ac3-446b-a794-6dd376201333","Type":"ContainerStarted","Data":"11ef6563f6bacb765c37b088386b372f149d82b24b0113c04008833fe01f51db"}
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.327615 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a9b12460-0ac3-446b-a794-6dd376201333" containerName="glance-log" containerID="cri-o://37b0ebf2d515f95379d1e03ed812cd0a4fd7e9e08ecddb531591e365c8451a31" gracePeriod=30
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.327774 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a9b12460-0ac3-446b-a794-6dd376201333" containerName="glance-httpd" containerID="cri-o://11ef6563f6bacb765c37b088386b372f149d82b24b0113c04008833fe01f51db" gracePeriod=30
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.329503 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce","Type":"ContainerStarted","Data":"ae7735acd6488ba0745dbb2807500bbe87e32bc4be8658d1491296254395a9bb"}
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.353039 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" podStartSLOduration=5.35302157 podStartE2EDuration="5.35302157s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:07.345081221 +0000 UTC m=+1197.022804636" watchObservedRunningTime="2026-03-20 13:44:07.35302157 +0000 UTC m=+1197.030744965"
Mar 20 13:44:07 crc kubenswrapper[4849]: I0320 13:44:07.381105 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.381085634 podStartE2EDuration="5.381085634s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:07.375532727 +0000 UTC m=+1197.053256132" watchObservedRunningTime="2026-03-20 13:44:07.381085634 +0000 UTC m=+1197.058809019"
Mar 20 13:44:08 crc kubenswrapper[4849]: I0320 13:44:08.343956 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce","Type":"ContainerStarted","Data":"2e6205a0c9633020e3b7faedbdcde9891d0d3956638cb17ad46d663df50bd90d"}
Mar 20 13:44:08 crc kubenswrapper[4849]: I0320 13:44:08.344060 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerName="glance-log" containerID="cri-o://ae7735acd6488ba0745dbb2807500bbe87e32bc4be8658d1491296254395a9bb" gracePeriod=30
Mar 20 13:44:08 crc kubenswrapper[4849]: I0320 13:44:08.344135 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerName="glance-httpd" containerID="cri-o://2e6205a0c9633020e3b7faedbdcde9891d0d3956638cb17ad46d663df50bd90d" gracePeriod=30
Mar 20 13:44:08 crc kubenswrapper[4849]: I0320 13:44:08.383298 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.383283366 podStartE2EDuration="6.383283366s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:08.373457446 +0000 UTC m=+1198.051180841" watchObservedRunningTime="2026-03-20 13:44:08.383283366 +0000 UTC m=+1198.061006761"
Mar 20 13:44:08 crc kubenswrapper[4849]: I0320 13:44:08.413010 4849 generic.go:334] "Generic (PLEG): container finished" podID="a9b12460-0ac3-446b-a794-6dd376201333" containerID="11ef6563f6bacb765c37b088386b372f149d82b24b0113c04008833fe01f51db" exitCode=0
Mar 20 13:44:08 crc kubenswrapper[4849]: I0320 13:44:08.413041 4849 generic.go:334] "Generic (PLEG): container finished" podID="a9b12460-0ac3-446b-a794-6dd376201333" containerID="37b0ebf2d515f95379d1e03ed812cd0a4fd7e9e08ecddb531591e365c8451a31" exitCode=143
Mar 20 13:44:08 crc kubenswrapper[4849]: I0320 13:44:08.413094 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9b12460-0ac3-446b-a794-6dd376201333","Type":"ContainerDied","Data":"11ef6563f6bacb765c37b088386b372f149d82b24b0113c04008833fe01f51db"}
Mar 20 13:44:08 crc kubenswrapper[4849]: I0320 13:44:08.413147 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9b12460-0ac3-446b-a794-6dd376201333","Type":"ContainerDied","Data":"37b0ebf2d515f95379d1e03ed812cd0a4fd7e9e08ecddb531591e365c8451a31"}
Mar 20 13:44:09 crc kubenswrapper[4849]: I0320 13:44:09.385478 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:44:09 crc kubenswrapper[4849]: I0320 13:44:09.386113 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:44:09 crc kubenswrapper[4849]: I0320 13:44:09.432981 4849 generic.go:334] "Generic (PLEG): container finished" podID="b2b9d8d1-1879-4871-85d9-2192274358ac" containerID="a19c96162d1f1d6aa4932ff050024bf930cd0e515ebb07895568780b8b23af2c" exitCode=0
Mar 20 13:44:09 crc kubenswrapper[4849]: I0320 13:44:09.433071 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gz947" event={"ID":"b2b9d8d1-1879-4871-85d9-2192274358ac","Type":"ContainerDied","Data":"a19c96162d1f1d6aa4932ff050024bf930cd0e515ebb07895568780b8b23af2c"}
Mar 20 13:44:09 crc kubenswrapper[4849]: I0320 13:44:09.437428 4849 generic.go:334] "Generic (PLEG): container finished" podID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerID="2e6205a0c9633020e3b7faedbdcde9891d0d3956638cb17ad46d663df50bd90d" exitCode=0
Mar 20 13:44:09 crc kubenswrapper[4849]: I0320 13:44:09.437467 4849 generic.go:334] "Generic (PLEG): container finished" podID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerID="ae7735acd6488ba0745dbb2807500bbe87e32bc4be8658d1491296254395a9bb" exitCode=143
Mar 20 13:44:09 crc kubenswrapper[4849]: I0320 13:44:09.437560 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce","Type":"ContainerDied","Data":"2e6205a0c9633020e3b7faedbdcde9891d0d3956638cb17ad46d663df50bd90d"}
Mar 20 13:44:09 crc kubenswrapper[4849]: I0320 13:44:09.437614 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce","Type":"ContainerDied","Data":"ae7735acd6488ba0745dbb2807500bbe87e32bc4be8658d1491296254395a9bb"}
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.127011 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79d4788db5-tz9b5"]
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.213393 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68899bcb64-snjqk"]
Mar 20 13:44:11 crc kubenswrapper[4849]: E0320 13:44:11.213796 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bec97e6-a0d7-484e-ba32-9469c22ff871" containerName="oc"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.213808 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bec97e6-a0d7-484e-ba32-9469c22ff871" containerName="oc"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.213996 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bec97e6-a0d7-484e-ba32-9469c22ff871" containerName="oc"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.214844 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.232769 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.262180 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68899bcb64-snjqk"]
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.266654 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-secret-key\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.266722 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-tls-certs\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.266789 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-combined-ca-bundle\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.283273 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/4623c171-dfb8-42e6-9038-a95ed2871b75-kube-api-access-8vzpx\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.283452 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-scripts\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.283487 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4623c171-dfb8-42e6-9038-a95ed2871b75-logs\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.283528 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-config-data\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.385752 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-combined-ca-bundle\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.385876 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/4623c171-dfb8-42e6-9038-a95ed2871b75-kube-api-access-8vzpx\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.385924 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-scripts\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.385940 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4623c171-dfb8-42e6-9038-a95ed2871b75-logs\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.385958 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-config-data\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.385995 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-secret-key\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.386010 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-tls-certs\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.393162 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4623c171-dfb8-42e6-9038-a95ed2871b75-logs\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.412192 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-config-data\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.444047 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-combined-ca-bundle\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.459205 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/4623c171-dfb8-42e6-9038-a95ed2871b75-kube-api-access-8vzpx\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.459508 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-scripts\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.463311 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-tls-certs\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.463394 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-secret-key\") pod \"horizon-68899bcb64-snjqk\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.527669 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76b4654f4f-2sft4"]
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.554443 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f784755c6-j267c"]
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.555869 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.569088 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.598602 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f784755c6-j267c"]
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.691920 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-horizon-tls-certs\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.691991 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/852cbb75-7003-4545-9b7b-b2eb83d269ac-scripts\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.692459 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/852cbb75-7003-4545-9b7b-b2eb83d269ac-logs\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.692498 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwvcb\" (UniqueName: \"kubernetes.io/projected/852cbb75-7003-4545-9b7b-b2eb83d269ac-kube-api-access-mwvcb\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.692613 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-horizon-secret-key\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.692643 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-combined-ca-bundle\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.692898 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/852cbb75-7003-4545-9b7b-b2eb83d269ac-config-data\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.794753 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwvcb\" (UniqueName: \"kubernetes.io/projected/852cbb75-7003-4545-9b7b-b2eb83d269ac-kube-api-access-mwvcb\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.794809 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-horizon-secret-key\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.794842 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-combined-ca-bundle\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.795215 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/852cbb75-7003-4545-9b7b-b2eb83d269ac-config-data\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.795263 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-horizon-tls-certs\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.795294 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/852cbb75-7003-4545-9b7b-b2eb83d269ac-scripts\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.795324 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/852cbb75-7003-4545-9b7b-b2eb83d269ac-logs\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.796075 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/852cbb75-7003-4545-9b7b-b2eb83d269ac-logs\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.796790 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/852cbb75-7003-4545-9b7b-b2eb83d269ac-scripts\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.797582 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/852cbb75-7003-4545-9b7b-b2eb83d269ac-config-data\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.798644 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-combined-ca-bundle\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.799470 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-horizon-secret-key\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.800996 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/852cbb75-7003-4545-9b7b-b2eb83d269ac-horizon-tls-certs\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.811101 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwvcb\" (UniqueName: \"kubernetes.io/projected/852cbb75-7003-4545-9b7b-b2eb83d269ac-kube-api-access-mwvcb\") pod \"horizon-7f784755c6-j267c\" (UID: \"852cbb75-7003-4545-9b7b-b2eb83d269ac\") " pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:11 crc kubenswrapper[4849]: I0320 13:44:11.884579 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f784755c6-j267c"
Mar 20 13:44:13 crc kubenswrapper[4849]: I0320 13:44:13.757898 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx"
Mar 20 13:44:13 crc kubenswrapper[4849]: I0320 13:44:13.811625 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kh6fq"]
Mar 20 13:44:13 crc kubenswrapper[4849]: I0320 13:44:13.811870 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="dnsmasq-dns" containerID="cri-o://5343facdadaca1fcaa74c67f5e2a10a1588517a0bb5bf386aefc4fc9c867833b" gracePeriod=10
Mar 20 13:44:14 crc kubenswrapper[4849]: I0320 13:44:14.492157 4849 generic.go:334] "Generic (PLEG): container finished" podID="e17094e8-90c1-4262-abaf-99d824238711" containerID="5343facdadaca1fcaa74c67f5e2a10a1588517a0bb5bf386aefc4fc9c867833b" exitCode=0
Mar 20 13:44:14 crc kubenswrapper[4849]: I0320 13:44:14.492360 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" event={"ID":"e17094e8-90c1-4262-abaf-99d824238711","Type":"ContainerDied","Data":"5343facdadaca1fcaa74c67f5e2a10a1588517a0bb5bf386aefc4fc9c867833b"}
Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.345264 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq"
podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: connect: connection refused" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.486667 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.566754 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9b12460-0ac3-446b-a794-6dd376201333","Type":"ContainerDied","Data":"ca866ab1781b2a2609951a1ed02d98d20112885a1e28888a28f765cbae8c8be4"} Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.566847 4849 scope.go:117] "RemoveContainer" containerID="11ef6563f6bacb765c37b088386b372f149d82b24b0113c04008833fe01f51db" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.567072 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.607197 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-internal-tls-certs\") pod \"a9b12460-0ac3-446b-a794-6dd376201333\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.607336 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-logs\") pod \"a9b12460-0ac3-446b-a794-6dd376201333\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.607427 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"a9b12460-0ac3-446b-a794-6dd376201333\" 
(UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.607508 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-config-data\") pod \"a9b12460-0ac3-446b-a794-6dd376201333\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.607717 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-httpd-run\") pod \"a9b12460-0ac3-446b-a794-6dd376201333\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.607800 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht5hs\" (UniqueName: \"kubernetes.io/projected/a9b12460-0ac3-446b-a794-6dd376201333-kube-api-access-ht5hs\") pod \"a9b12460-0ac3-446b-a794-6dd376201333\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.608037 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-combined-ca-bundle\") pod \"a9b12460-0ac3-446b-a794-6dd376201333\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.608157 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-scripts\") pod \"a9b12460-0ac3-446b-a794-6dd376201333\" (UID: \"a9b12460-0ac3-446b-a794-6dd376201333\") " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.610221 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a9b12460-0ac3-446b-a794-6dd376201333" (UID: "a9b12460-0ac3-446b-a794-6dd376201333"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.611054 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-logs" (OuterVolumeSpecName: "logs") pod "a9b12460-0ac3-446b-a794-6dd376201333" (UID: "a9b12460-0ac3-446b-a794-6dd376201333"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.617776 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "a9b12460-0ac3-446b-a794-6dd376201333" (UID: "a9b12460-0ac3-446b-a794-6dd376201333"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.618770 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-scripts" (OuterVolumeSpecName: "scripts") pod "a9b12460-0ac3-446b-a794-6dd376201333" (UID: "a9b12460-0ac3-446b-a794-6dd376201333"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.628388 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b12460-0ac3-446b-a794-6dd376201333-kube-api-access-ht5hs" (OuterVolumeSpecName: "kube-api-access-ht5hs") pod "a9b12460-0ac3-446b-a794-6dd376201333" (UID: "a9b12460-0ac3-446b-a794-6dd376201333"). InnerVolumeSpecName "kube-api-access-ht5hs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.654814 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9b12460-0ac3-446b-a794-6dd376201333" (UID: "a9b12460-0ac3-446b-a794-6dd376201333"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.691556 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-config-data" (OuterVolumeSpecName: "config-data") pod "a9b12460-0ac3-446b-a794-6dd376201333" (UID: "a9b12460-0ac3-446b-a794-6dd376201333"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.707000 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a9b12460-0ac3-446b-a794-6dd376201333" (UID: "a9b12460-0ac3-446b-a794-6dd376201333"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.714936 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.714975 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.714986 4849 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.714999 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.715039 4849 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.715050 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9b12460-0ac3-446b-a794-6dd376201333-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.715060 4849 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9b12460-0ac3-446b-a794-6dd376201333-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.715071 4849 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ht5hs\" (UniqueName: \"kubernetes.io/projected/a9b12460-0ac3-446b-a794-6dd376201333-kube-api-access-ht5hs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.736415 4849 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.816705 4849 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.902129 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.916082 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.927638 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:44:17 crc kubenswrapper[4849]: E0320 13:44:17.928001 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b12460-0ac3-446b-a794-6dd376201333" containerName="glance-httpd" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.928018 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b12460-0ac3-446b-a794-6dd376201333" containerName="glance-httpd" Mar 20 13:44:17 crc kubenswrapper[4849]: E0320 13:44:17.928030 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b12460-0ac3-446b-a794-6dd376201333" containerName="glance-log" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.928036 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b12460-0ac3-446b-a794-6dd376201333" containerName="glance-log" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.928190 4849 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a9b12460-0ac3-446b-a794-6dd376201333" containerName="glance-httpd" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.928206 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b12460-0ac3-446b-a794-6dd376201333" containerName="glance-log" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.929032 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.946967 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.947035 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:44:17 crc kubenswrapper[4849]: I0320 13:44:17.981283 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.020140 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.020281 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r5xl\" (UniqueName: \"kubernetes.io/projected/cba85559-136f-44e8-abc0-569ff409b64f-kube-api-access-4r5xl\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.020348 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-logs\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.020368 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.020386 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.020418 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.020441 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.020484 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.121940 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-logs\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.122047 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.122074 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.122529 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-logs\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.123034 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.123067 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.123174 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.123298 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.123882 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r5xl\" (UniqueName: \"kubernetes.io/projected/cba85559-136f-44e8-abc0-569ff409b64f-kube-api-access-4r5xl\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.124596 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.126739 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.129602 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.136498 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.146318 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.146526 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.146929 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r5xl\" (UniqueName: \"kubernetes.io/projected/cba85559-136f-44e8-abc0-569ff409b64f-kube-api-access-4r5xl\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.175058 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:44:18 crc kubenswrapper[4849]: I0320 13:44:18.251106 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:19 crc kubenswrapper[4849]: I0320 13:44:19.047054 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b12460-0ac3-446b-a794-6dd376201333" path="/var/lib/kubelet/pods/a9b12460-0ac3-446b-a794-6dd376201333/volumes" Mar 20 13:44:22 crc kubenswrapper[4849]: E0320 13:44:22.145597 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 13:44:22 crc kubenswrapper[4849]: E0320 13:44:22.146234 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n586h64ch697h66fh669h7ch587hbh66ch648h66fhcdh569h7dh66bh58fh68dh88h55bh658h55bhfdh68dh7ch578h668h6fh54h566hch5ffh5b9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzpnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-76b4654f4f-2sft4_openstack(d9c58468-f82b-4d72-9a56-a9ebf39db54e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:44:22 crc kubenswrapper[4849]: E0320 
13:44:22.148296 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-76b4654f4f-2sft4" podUID="d9c58468-f82b-4d72-9a56-a9ebf39db54e" Mar 20 13:44:22 crc kubenswrapper[4849]: E0320 13:44:22.156715 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 13:44:22 crc kubenswrapper[4849]: E0320 13:44:22.156913 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dch98hb4h76h5b5h5dfhd5h78hdhfbhc5h59fh669hd9h66dhfh5h684h57ch55fh8fh69hcbh6dh594hc4h5b4hfch555h697h9fh554q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtsdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-74457cc7cf-gpm8s_openstack(fd4ea7fa-1a83-4d99-a185-fa5a91636aa7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:44:22 crc kubenswrapper[4849]: E0320 13:44:22.159180 
4849 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-74457cc7cf-gpm8s" podUID="fd4ea7fa-1a83-4d99-a185-fa5a91636aa7" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.261002 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.297765 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66xnv\" (UniqueName: \"kubernetes.io/projected/b2b9d8d1-1879-4871-85d9-2192274358ac-kube-api-access-66xnv\") pod \"b2b9d8d1-1879-4871-85d9-2192274358ac\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.297871 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-combined-ca-bundle\") pod \"b2b9d8d1-1879-4871-85d9-2192274358ac\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.297898 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-fernet-keys\") pod \"b2b9d8d1-1879-4871-85d9-2192274358ac\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.297949 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-config-data\") pod 
\"b2b9d8d1-1879-4871-85d9-2192274358ac\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.298046 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-scripts\") pod \"b2b9d8d1-1879-4871-85d9-2192274358ac\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.298134 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-credential-keys\") pod \"b2b9d8d1-1879-4871-85d9-2192274358ac\" (UID: \"b2b9d8d1-1879-4871-85d9-2192274358ac\") " Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.303842 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b2b9d8d1-1879-4871-85d9-2192274358ac" (UID: "b2b9d8d1-1879-4871-85d9-2192274358ac"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.305490 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-scripts" (OuterVolumeSpecName: "scripts") pod "b2b9d8d1-1879-4871-85d9-2192274358ac" (UID: "b2b9d8d1-1879-4871-85d9-2192274358ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.307631 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b2b9d8d1-1879-4871-85d9-2192274358ac" (UID: "b2b9d8d1-1879-4871-85d9-2192274358ac"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.315254 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b9d8d1-1879-4871-85d9-2192274358ac-kube-api-access-66xnv" (OuterVolumeSpecName: "kube-api-access-66xnv") pod "b2b9d8d1-1879-4871-85d9-2192274358ac" (UID: "b2b9d8d1-1879-4871-85d9-2192274358ac"). InnerVolumeSpecName "kube-api-access-66xnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.331208 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2b9d8d1-1879-4871-85d9-2192274358ac" (UID: "b2b9d8d1-1879-4871-85d9-2192274358ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.331685 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-config-data" (OuterVolumeSpecName: "config-data") pod "b2b9d8d1-1879-4871-85d9-2192274358ac" (UID: "b2b9d8d1-1879-4871-85d9-2192274358ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.399670 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66xnv\" (UniqueName: \"kubernetes.io/projected/b2b9d8d1-1879-4871-85d9-2192274358ac-kube-api-access-66xnv\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.399705 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.399715 4849 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.399724 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.399732 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.399739 4849 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b2b9d8d1-1879-4871-85d9-2192274358ac-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.610604 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gz947" event={"ID":"b2b9d8d1-1879-4871-85d9-2192274358ac","Type":"ContainerDied","Data":"66d5e6a8800575a9e8e35f23ba0eef5530bce9fffaef26bba4e56f1160033d45"} Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 
13:44:22.610983 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d5e6a8800575a9e8e35f23ba0eef5530bce9fffaef26bba4e56f1160033d45" Mar 20 13:44:22 crc kubenswrapper[4849]: I0320 13:44:22.610897 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gz947" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.382323 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gz947"] Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.388739 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gz947"] Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.449545 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6qksz"] Mar 20 13:44:23 crc kubenswrapper[4849]: E0320 13:44:23.449941 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b9d8d1-1879-4871-85d9-2192274358ac" containerName="keystone-bootstrap" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.449959 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b9d8d1-1879-4871-85d9-2192274358ac" containerName="keystone-bootstrap" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.450130 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b9d8d1-1879-4871-85d9-2192274358ac" containerName="keystone-bootstrap" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.450628 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.453147 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.453215 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5zkb" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.453962 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.454177 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.454230 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.459897 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6qksz"] Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.521396 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-fernet-keys\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.521473 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-scripts\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.521490 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-config-data\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.521510 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-credential-keys\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.521551 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cghm\" (UniqueName: \"kubernetes.io/projected/6f18d572-488f-4e4e-9596-3b99b5298123-kube-api-access-2cghm\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.521578 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-combined-ca-bundle\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.623525 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-fernet-keys\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.623589 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-scripts\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.623613 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-config-data\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.623638 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-credential-keys\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.623658 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cghm\" (UniqueName: \"kubernetes.io/projected/6f18d572-488f-4e4e-9596-3b99b5298123-kube-api-access-2cghm\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.623685 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-combined-ca-bundle\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.629588 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-scripts\") pod \"keystone-bootstrap-6qksz\" (UID: 
\"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.633741 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-fernet-keys\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.633777 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-credential-keys\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.634302 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-combined-ca-bundle\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.642206 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-config-data\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 13:44:23.642455 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cghm\" (UniqueName: \"kubernetes.io/projected/6f18d572-488f-4e4e-9596-3b99b5298123-kube-api-access-2cghm\") pod \"keystone-bootstrap-6qksz\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:23 crc kubenswrapper[4849]: I0320 
13:44:23.777137 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:25 crc kubenswrapper[4849]: I0320 13:44:25.045602 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b9d8d1-1879-4871-85d9-2192274358ac" path="/var/lib/kubelet/pods/b2b9d8d1-1879-4871-85d9-2192274358ac/volumes" Mar 20 13:44:27 crc kubenswrapper[4849]: I0320 13:44:27.346083 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Mar 20 13:44:28 crc kubenswrapper[4849]: I0320 13:44:28.665367 4849 generic.go:334] "Generic (PLEG): container finished" podID="c2a5cf24-7a8d-40f9-87cc-0b9b6533e520" containerID="7f3cb2db57e1ff97b4333f399e9f9b2263113c1cda171a7408e89a733fef5fd3" exitCode=0 Mar 20 13:44:28 crc kubenswrapper[4849]: I0320 13:44:28.665467 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wh76b" event={"ID":"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520","Type":"ContainerDied","Data":"7f3cb2db57e1ff97b4333f399e9f9b2263113c1cda171a7408e89a733fef5fd3"} Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.868196 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.871948 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnf66\" (UniqueName: \"kubernetes.io/projected/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-kube-api-access-jnf66\") pod \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.876031 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-config\") pod \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.876472 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-combined-ca-bundle\") pod \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\" (UID: \"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520\") " Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.879370 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-kube-api-access-jnf66" (OuterVolumeSpecName: "kube-api-access-jnf66") pod "c2a5cf24-7a8d-40f9-87cc-0b9b6533e520" (UID: "c2a5cf24-7a8d-40f9-87cc-0b9b6533e520"). InnerVolumeSpecName "kube-api-access-jnf66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.905445 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-config" (OuterVolumeSpecName: "config") pod "c2a5cf24-7a8d-40f9-87cc-0b9b6533e520" (UID: "c2a5cf24-7a8d-40f9-87cc-0b9b6533e520"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.907157 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2a5cf24-7a8d-40f9-87cc-0b9b6533e520" (UID: "c2a5cf24-7a8d-40f9-87cc-0b9b6533e520"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.980131 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.980163 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:30 crc kubenswrapper[4849]: I0320 13:44:30.980174 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnf66\" (UniqueName: \"kubernetes.io/projected/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520-kube-api-access-jnf66\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:31 crc kubenswrapper[4849]: I0320 13:44:31.701000 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-wh76b" event={"ID":"c2a5cf24-7a8d-40f9-87cc-0b9b6533e520","Type":"ContainerDied","Data":"08a57355a02cf30937326c6d254846fd581ca4c68e7f05672142639968c2ecaf"} Mar 20 13:44:31 crc kubenswrapper[4849]: I0320 13:44:31.701480 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08a57355a02cf30937326c6d254846fd581ca4c68e7f05672142639968c2ecaf" Mar 20 13:44:31 crc kubenswrapper[4849]: I0320 13:44:31.701086 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-wh76b" Mar 20 13:44:31 crc kubenswrapper[4849]: E0320 13:44:31.974124 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 13:44:31 crc kubenswrapper[4849]: E0320 13:44:31.974268 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-c
a-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wchsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jk575_openstack(701dfbaa-ecac-4290-9402-90c866ccd108): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:44:31 crc kubenswrapper[4849]: E0320 13:44:31.976983 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jk575" podUID="701dfbaa-ecac-4290-9402-90c866ccd108" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.047020 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h6lmv"] Mar 20 13:44:32 crc kubenswrapper[4849]: E0320 13:44:32.047360 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a5cf24-7a8d-40f9-87cc-0b9b6533e520" containerName="neutron-db-sync" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.047371 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a5cf24-7a8d-40f9-87cc-0b9b6533e520" containerName="neutron-db-sync" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.047547 
4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a5cf24-7a8d-40f9-87cc-0b9b6533e520" containerName="neutron-db-sync" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.048410 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.061249 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h6lmv"] Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.132469 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b4f644b5b-zlgdg"] Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.133883 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.138442 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8qbgb" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.138631 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.138741 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.138880 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.144771 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b4f644b5b-zlgdg"] Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.203266 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-config\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " 
pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.203359 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.203397 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.203477 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.203543 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq6p8\" (UniqueName: \"kubernetes.io/projected/effa400c-2afb-4abd-a644-6c15cdad3ee1-kube-api-access-gq6p8\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.203566 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" 
(UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305051 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-config\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305121 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-combined-ca-bundle\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305151 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305192 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-ovndb-tls-certs\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305224 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") 
" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305248 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-httpd-config\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305278 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305316 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-config\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305362 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq6p8\" (UniqueName: \"kubernetes.io/projected/effa400c-2afb-4abd-a644-6c15cdad3ee1-kube-api-access-gq6p8\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305383 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 
20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.305398 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbv4s\" (UniqueName: \"kubernetes.io/projected/dfdf38eb-05df-43ec-acb7-258da19c432a-kube-api-access-bbv4s\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.306343 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-config\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.307039 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.307158 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.307380 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-svc\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.307447 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.347120 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.347831 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.354716 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq6p8\" (UniqueName: \"kubernetes.io/projected/effa400c-2afb-4abd-a644-6c15cdad3ee1-kube-api-access-gq6p8\") pod \"dnsmasq-dns-55f844cf75-h6lmv\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.377608 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.406850 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-combined-ca-bundle\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.406905 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-ovndb-tls-certs\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.406943 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-httpd-config\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.406995 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-config\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.407068 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbv4s\" (UniqueName: \"kubernetes.io/projected/dfdf38eb-05df-43ec-acb7-258da19c432a-kube-api-access-bbv4s\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: 
I0320 13:44:32.415123 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-httpd-config\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.420706 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-combined-ca-bundle\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.425752 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbv4s\" (UniqueName: \"kubernetes.io/projected/dfdf38eb-05df-43ec-acb7-258da19c432a-kube-api-access-bbv4s\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.433404 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-ovndb-tls-certs\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.434921 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-config\") pod \"neutron-b4f644b5b-zlgdg\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") " pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.456261 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:44:32 crc kubenswrapper[4849]: E0320 13:44:32.712244 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-jk575" podUID="701dfbaa-ecac-4290-9402-90c866ccd108" Mar 20 13:44:32 crc kubenswrapper[4849]: E0320 13:44:32.892247 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 13:44:32 crc kubenswrapper[4849]: E0320 13:44:32.892420 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xxcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-gwq28_openstack(ee9399c2-4755-4acd-8514-7d49cdd92f16): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:44:32 crc kubenswrapper[4849]: E0320 13:44:32.893578 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-gwq28" 
podUID="ee9399c2-4755-4acd-8514-7d49cdd92f16" Mar 20 13:44:32 crc kubenswrapper[4849]: I0320 13:44:32.944330 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.127267 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-public-tls-certs\") pod \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.127334 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-config-data\") pod \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.127424 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmc7d\" (UniqueName: \"kubernetes.io/projected/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-kube-api-access-tmc7d\") pod \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.127472 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-scripts\") pod \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.127515 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-logs\") pod \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " Mar 20 13:44:33 crc 
kubenswrapper[4849]: I0320 13:44:33.128279 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-logs" (OuterVolumeSpecName: "logs") pod "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" (UID: "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.128695 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.128732 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-combined-ca-bundle\") pod \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.128757 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-httpd-run\") pod \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\" (UID: \"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.129237 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.129257 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" (UID: "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.131076 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-scripts" (OuterVolumeSpecName: "scripts") pod "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" (UID: "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.131092 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-kube-api-access-tmc7d" (OuterVolumeSpecName: "kube-api-access-tmc7d") pod "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" (UID: "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce"). InnerVolumeSpecName "kube-api-access-tmc7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.132386 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" (UID: "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.151964 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" (UID: "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.169307 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" (UID: "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.181680 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-config-data" (OuterVolumeSpecName: "config-data") pod "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" (UID: "ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.232935 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmc7d\" (UniqueName: \"kubernetes.io/projected/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-kube-api-access-tmc7d\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.233643 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.233739 4849 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.233762 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 
crc kubenswrapper[4849]: I0320 13:44:33.233780 4849 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.233800 4849 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.233839 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.260468 4849 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.335152 4849 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: E0320 13:44:33.401277 4849 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 13:44:33 crc kubenswrapper[4849]: E0320 13:44:33.401410 4849 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68fh5c9h57bh74h66chbch96h66ch5c5h66h5d5h688h56ch54fh64ch645h565h597h5d6h556h96h5cdh6bh565hf4h5dfh8h5f8h588hfh8dhc8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fthp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(5e321362-ff76-4c26-bbde-9a97617ca460): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.403738 4849 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.460183 4849 scope.go:117] "RemoveContainer" containerID="37b0ebf2d515f95379d1e03ed812cd0a4fd7e9e08ecddb531591e365c8451a31" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.534008 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76b4654f4f-2sft4" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.552538 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.557433 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640086 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9c58468-f82b-4d72-9a56-a9ebf39db54e-horizon-secret-key\") pod \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640174 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c58468-f82b-4d72-9a56-a9ebf39db54e-logs\") pod \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640209 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-config-data\") pod \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640300 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzpnw\" (UniqueName: \"kubernetes.io/projected/d9c58468-f82b-4d72-9a56-a9ebf39db54e-kube-api-access-mzpnw\") pod \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640323 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-scripts\") pod \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\" (UID: \"d9c58468-f82b-4d72-9a56-a9ebf39db54e\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640486 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-nb\") pod \"e17094e8-90c1-4262-abaf-99d824238711\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640905 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-svc\") pod \"e17094e8-90c1-4262-abaf-99d824238711\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640953 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtsdp\" (UniqueName: \"kubernetes.io/projected/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-kube-api-access-gtsdp\") pod \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.640975 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-horizon-secret-key\") pod \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.643171 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c58468-f82b-4d72-9a56-a9ebf39db54e-logs" (OuterVolumeSpecName: "logs") pod "d9c58468-f82b-4d72-9a56-a9ebf39db54e" (UID: "d9c58468-f82b-4d72-9a56-a9ebf39db54e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.643212 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-scripts" (OuterVolumeSpecName: "scripts") pod "d9c58468-f82b-4d72-9a56-a9ebf39db54e" (UID: "d9c58468-f82b-4d72-9a56-a9ebf39db54e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.644869 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-config-data" (OuterVolumeSpecName: "config-data") pod "d9c58468-f82b-4d72-9a56-a9ebf39db54e" (UID: "d9c58468-f82b-4d72-9a56-a9ebf39db54e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.653302 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c58468-f82b-4d72-9a56-a9ebf39db54e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d9c58468-f82b-4d72-9a56-a9ebf39db54e" (UID: "d9c58468-f82b-4d72-9a56-a9ebf39db54e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.657510 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-kube-api-access-gtsdp" (OuterVolumeSpecName: "kube-api-access-gtsdp") pod "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7" (UID: "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7"). InnerVolumeSpecName "kube-api-access-gtsdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.658098 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c58468-f82b-4d72-9a56-a9ebf39db54e-kube-api-access-mzpnw" (OuterVolumeSpecName: "kube-api-access-mzpnw") pod "d9c58468-f82b-4d72-9a56-a9ebf39db54e" (UID: "d9c58468-f82b-4d72-9a56-a9ebf39db54e"). InnerVolumeSpecName "kube-api-access-mzpnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.658514 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7" (UID: "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.709949 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e17094e8-90c1-4262-abaf-99d824238711" (UID: "e17094e8-90c1-4262-abaf-99d824238711"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.723095 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e17094e8-90c1-4262-abaf-99d824238711" (UID: "e17094e8-90c1-4262-abaf-99d824238711"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.742979 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-config\") pod \"e17094e8-90c1-4262-abaf-99d824238711\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.743209 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-swift-storage-0\") pod \"e17094e8-90c1-4262-abaf-99d824238711\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.743250 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-sb\") pod \"e17094e8-90c1-4262-abaf-99d824238711\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.743290 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-config-data\") pod \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.743325 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvc5b\" (UniqueName: \"kubernetes.io/projected/e17094e8-90c1-4262-abaf-99d824238711-kube-api-access-xvc5b\") pod \"e17094e8-90c1-4262-abaf-99d824238711\" (UID: \"e17094e8-90c1-4262-abaf-99d824238711\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.743399 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-scripts\") pod \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.743669 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-logs\") pod \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\" (UID: \"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7\") " Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744227 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-logs" (OuterVolumeSpecName: "logs") pod "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7" (UID: "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744292 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzpnw\" (UniqueName: \"kubernetes.io/projected/d9c58468-f82b-4d72-9a56-a9ebf39db54e-kube-api-access-mzpnw\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744319 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744334 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744359 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc 
kubenswrapper[4849]: I0320 13:44:33.744371 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtsdp\" (UniqueName: \"kubernetes.io/projected/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-kube-api-access-gtsdp\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744388 4849 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744416 4849 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d9c58468-f82b-4d72-9a56-a9ebf39db54e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744428 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9c58468-f82b-4d72-9a56-a9ebf39db54e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744440 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d9c58468-f82b-4d72-9a56-a9ebf39db54e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.744522 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-scripts" (OuterVolumeSpecName: "scripts") pod "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7" (UID: "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.745002 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-config-data" (OuterVolumeSpecName: "config-data") pod "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7" (UID: "fd4ea7fa-1a83-4d99-a185-fa5a91636aa7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.751026 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17094e8-90c1-4262-abaf-99d824238711-kube-api-access-xvc5b" (OuterVolumeSpecName: "kube-api-access-xvc5b") pod "e17094e8-90c1-4262-abaf-99d824238711" (UID: "e17094e8-90c1-4262-abaf-99d824238711"). InnerVolumeSpecName "kube-api-access-xvc5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.772133 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74457cc7cf-gpm8s" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.772205 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74457cc7cf-gpm8s" event={"ID":"fd4ea7fa-1a83-4d99-a185-fa5a91636aa7","Type":"ContainerDied","Data":"14f819c2bf87604def8b025a2a19a1c6482ca77f88cf1e7226fa5a16fd0add68"} Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.786838 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76b4654f4f-2sft4" event={"ID":"d9c58468-f82b-4d72-9a56-a9ebf39db54e","Type":"ContainerDied","Data":"83811919dc87d8193c9fde9ce088b8ede425fe6b2afeca3404cbb27641a3aeae"} Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.786942 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76b4654f4f-2sft4" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.795476 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" event={"ID":"e17094e8-90c1-4262-abaf-99d824238711","Type":"ContainerDied","Data":"0c193d9aa43e6a460a9f1318f61b386afd9288d86e03df73f82d1e7e95137bcb"} Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.795528 4849 scope.go:117] "RemoveContainer" containerID="5343facdadaca1fcaa74c67f5e2a10a1588517a0bb5bf386aefc4fc9c867833b" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.795582 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.801698 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.801742 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce","Type":"ContainerDied","Data":"d3ddab40047a5a30291f96ae300e3f1976b6a1086baeff2596bebc16a3c3c206"} Mar 20 13:44:33 crc kubenswrapper[4849]: E0320 13:44:33.802712 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-gwq28" podUID="ee9399c2-4755-4acd-8514-7d49cdd92f16" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.805429 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e17094e8-90c1-4262-abaf-99d824238711" (UID: "e17094e8-90c1-4262-abaf-99d824238711"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.814625 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e17094e8-90c1-4262-abaf-99d824238711" (UID: "e17094e8-90c1-4262-abaf-99d824238711"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.835780 4849 scope.go:117] "RemoveContainer" containerID="72d41a411d14a7db0d177f21a3ddd0abc333316cd838a70a608f93d8dd790124" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.840125 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-config" (OuterVolumeSpecName: "config") pod "e17094e8-90c1-4262-abaf-99d824238711" (UID: "e17094e8-90c1-4262-abaf-99d824238711"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.851554 4849 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.851607 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.851622 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.851688 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvc5b\" (UniqueName: \"kubernetes.io/projected/e17094e8-90c1-4262-abaf-99d824238711-kube-api-access-xvc5b\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.851703 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.851719 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.851729 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17094e8-90c1-4262-abaf-99d824238711-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.882109 4849 scope.go:117] "RemoveContainer" 
containerID="2e6205a0c9633020e3b7faedbdcde9891d0d3956638cb17ad46d663df50bd90d" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.904143 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.946928 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.949273 4849 scope.go:117] "RemoveContainer" containerID="ae7735acd6488ba0745dbb2807500bbe87e32bc4be8658d1491296254395a9bb" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.959032 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:44:33 crc kubenswrapper[4849]: E0320 13:44:33.963979 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerName="glance-log" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.964290 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerName="glance-log" Mar 20 13:44:33 crc kubenswrapper[4849]: E0320 13:44:33.964307 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="dnsmasq-dns" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.964314 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="dnsmasq-dns" Mar 20 13:44:33 crc kubenswrapper[4849]: E0320 13:44:33.964344 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerName="glance-httpd" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.964352 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerName="glance-httpd" Mar 20 13:44:33 crc kubenswrapper[4849]: E0320 13:44:33.964376 4849 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="init" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.964383 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="init" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.964603 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerName="glance-log" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.964619 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" containerName="glance-httpd" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.964631 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="dnsmasq-dns" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.965774 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.968441 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:44:33 crc kubenswrapper[4849]: I0320 13:44:33.970847 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.024356 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74457cc7cf-gpm8s"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.036903 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.053907 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.053957 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-scripts\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.053978 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-logs\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 
13:44:34.054020 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhgk9\" (UniqueName: \"kubernetes.io/projected/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-kube-api-access-dhgk9\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.054123 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.054166 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.054185 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.054252 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-config-data\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc 
kubenswrapper[4849]: I0320 13:44:34.060548 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-74457cc7cf-gpm8s"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.073140 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f784755c6-j267c"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.122843 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76b4654f4f-2sft4"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.137711 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76b4654f4f-2sft4"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.157589 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79ff4b8df9-qp9mn"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.158627 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhgk9\" (UniqueName: \"kubernetes.io/projected/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-kube-api-access-dhgk9\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.159108 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.159861 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.159986 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.160015 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.160040 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-config-data\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.160136 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 
crc kubenswrapper[4849]: I0320 13:44:34.160198 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-scripts\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.160225 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-logs\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.160953 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.161616 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.162832 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-logs\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.163272 4849 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-neutron-public-svc" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.163382 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.166589 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-scripts\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.171438 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.171514 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79ff4b8df9-qp9mn"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.176987 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhgk9\" (UniqueName: \"kubernetes.io/projected/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-kube-api-access-dhgk9\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.182626 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-config-data\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.188053 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.279314 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " pod="openstack/glance-default-external-api-0" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.280053 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-public-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.282969 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-internal-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.283046 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-httpd-config\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.284379 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-combined-ca-bundle\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.284519 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-config\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.284569 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl2tq\" (UniqueName: \"kubernetes.io/projected/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-kube-api-access-kl2tq\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.284944 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-ovndb-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.291537 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6qksz"] Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.292455 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.304420 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68899bcb64-snjqk"]
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.365426 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.394660 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-ovndb-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.394856 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-public-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.394932 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-internal-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.394964 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-httpd-config\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.395032 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-combined-ca-bundle\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.395062 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-config\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.395097 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl2tq\" (UniqueName: \"kubernetes.io/projected/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-kube-api-access-kl2tq\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.399663 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-public-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.400056 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-internal-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.400834 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-config\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.401598 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-combined-ca-bundle\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.401811 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-httpd-config\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.402485 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-ovndb-tls-certs\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.419638 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl2tq\" (UniqueName: \"kubernetes.io/projected/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-kube-api-access-kl2tq\") pod \"neutron-79ff4b8df9-qp9mn\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.487486 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kh6fq"]
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.498248 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kh6fq"]
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.517121 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h6lmv"]
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.550033 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b4f644b5b-zlgdg"]
Mar 20 13:44:34 crc kubenswrapper[4849]: W0320 13:44:34.551799 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeffa400c_2afb_4abd_a644_6c15cdad3ee1.slice/crio-dcff2ab7e0c2bde67672d68ddf76362cc2f9ad08c2ffa831f21fc4fdd9dcb0cb WatchSource:0}: Error finding container dcff2ab7e0c2bde67672d68ddf76362cc2f9ad08c2ffa831f21fc4fdd9dcb0cb: Status 404 returned error can't find the container with id dcff2ab7e0c2bde67672d68ddf76362cc2f9ad08c2ffa831f21fc4fdd9dcb0cb
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.583377 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.871087 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f644b5b-zlgdg" event={"ID":"dfdf38eb-05df-43ec-acb7-258da19c432a","Type":"ContainerStarted","Data":"18fbcc631155497caad7a263000aeae1578eb13c7c25dc181f5b3a56c6d8efdb"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.876632 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba85559-136f-44e8-abc0-569ff409b64f","Type":"ContainerStarted","Data":"93830b1da5108fd4904dc53843c848128ccf7fb55b9ef327dfe1bf0992864a17"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.889332 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68899bcb64-snjqk" event={"ID":"4623c171-dfb8-42e6-9038-a95ed2871b75","Type":"ContainerStarted","Data":"0a9174601ae80fb33d84c0233114f71b6f537dcd6ab8a11c4d2eb6c6ce77db76"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.889393 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68899bcb64-snjqk" event={"ID":"4623c171-dfb8-42e6-9038-a95ed2871b75","Type":"ContainerStarted","Data":"44ee84ff30fb1326cc969d97bb705da62389e6f9763085b1f95fb6fd97bbf1e9"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.892671 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xbwzw" event={"ID":"39790e43-e227-4e13-8054-995e12255ec8","Type":"ContainerStarted","Data":"7ed8622f4b99236264791f2ed048fa77c650e00cce26a8711021fdbef54e5b37"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.912409 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" event={"ID":"effa400c-2afb-4abd-a644-6c15cdad3ee1","Type":"ContainerStarted","Data":"dcff2ab7e0c2bde67672d68ddf76362cc2f9ad08c2ffa831f21fc4fdd9dcb0cb"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.934507 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-xbwzw" podStartSLOduration=3.93175278 podStartE2EDuration="32.934482769s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="2026-03-20 13:44:04.486257707 +0000 UTC m=+1194.163981102" lastFinishedPulling="2026-03-20 13:44:33.488987696 +0000 UTC m=+1223.166711091" observedRunningTime="2026-03-20 13:44:34.915204228 +0000 UTC m=+1224.592927623" watchObservedRunningTime="2026-03-20 13:44:34.934482769 +0000 UTC m=+1224.612206164"
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.936091 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6qksz" event={"ID":"6f18d572-488f-4e4e-9596-3b99b5298123","Type":"ContainerStarted","Data":"80dd9311d2d3c9a0256f25fa30c71e71b906754fd19ca64fafe7cba22f93c606"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.936144 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6qksz" event={"ID":"6f18d572-488f-4e4e-9596-3b99b5298123","Type":"ContainerStarted","Data":"e4cdae3a8850cf2180d180ae6a12aafb6edda5e4d415c34c7e7cd93b91150af6"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.966971 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d4788db5-tz9b5" event={"ID":"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7","Type":"ContainerStarted","Data":"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.967021 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d4788db5-tz9b5" event={"ID":"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7","Type":"ContainerStarted","Data":"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.967169 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79d4788db5-tz9b5" podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerName="horizon-log" containerID="cri-o://60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7" gracePeriod=30
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.967281 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-79d4788db5-tz9b5" podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerName="horizon" containerID="cri-o://edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88" gracePeriod=30
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.987491 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f784755c6-j267c" event={"ID":"852cbb75-7003-4545-9b7b-b2eb83d269ac","Type":"ContainerStarted","Data":"78ac5577ab6183601b38ca732cf0d64dd7f17b2dd215842a5e809ddda4d99c1a"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.987533 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f784755c6-j267c" event={"ID":"852cbb75-7003-4545-9b7b-b2eb83d269ac","Type":"ContainerStarted","Data":"221f2e2fbc2ba377d0a8a514a63301a072aaa911711f9f3dc5680caef3bfcab2"}
Mar 20 13:44:34 crc kubenswrapper[4849]: I0320 13:44:34.987549 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f784755c6-j267c" event={"ID":"852cbb75-7003-4545-9b7b-b2eb83d269ac","Type":"ContainerStarted","Data":"3461d972075842762ae0e8f1049256ae6e0f316b6711e02ff7f01e56471f5410"}
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.029625 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6qksz" podStartSLOduration=12.029607578 podStartE2EDuration="12.029607578s" podCreationTimestamp="2026-03-20 13:44:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:34.975771412 +0000 UTC m=+1224.653494807" watchObservedRunningTime="2026-03-20 13:44:35.029607578 +0000 UTC m=+1224.707330973"
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.032454 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.072365 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-79d4788db5-tz9b5" podStartSLOduration=3.424755524 podStartE2EDuration="33.07234346s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="2026-03-20 13:44:03.751813327 +0000 UTC m=+1193.429536722" lastFinishedPulling="2026-03-20 13:44:33.399401263 +0000 UTC m=+1223.077124658" observedRunningTime="2026-03-20 13:44:35.013295476 +0000 UTC m=+1224.691018881" watchObservedRunningTime="2026-03-20 13:44:35.07234346 +0000 UTC m=+1224.750066855"
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.081088 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7f784755c6-j267c" podStartSLOduration=24.081068061 podStartE2EDuration="24.081068061s" podCreationTimestamp="2026-03-20 13:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:35.054469327 +0000 UTC m=+1224.732192722" watchObservedRunningTime="2026-03-20 13:44:35.081068061 +0000 UTC m=+1224.758791456"
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.081458 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce" path="/var/lib/kubelet/pods/ab9333e6-ac3b-4cfd-ab0a-5100cdb95dce/volumes"
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.083004 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c58468-f82b-4d72-9a56-a9ebf39db54e" path="/var/lib/kubelet/pods/d9c58468-f82b-4d72-9a56-a9ebf39db54e/volumes"
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.083628 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17094e8-90c1-4262-abaf-99d824238711" path="/var/lib/kubelet/pods/e17094e8-90c1-4262-abaf-99d824238711/volumes"
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.088395 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4ea7fa-1a83-4d99-a185-fa5a91636aa7" path="/var/lib/kubelet/pods/fd4ea7fa-1a83-4d99-a185-fa5a91636aa7/volumes"
Mar 20 13:44:35 crc kubenswrapper[4849]: I0320 13:44:35.330998 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79ff4b8df9-qp9mn"]
Mar 20 13:44:35 crc kubenswrapper[4849]: W0320 13:44:35.514586 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fb2e9c5_af86_4c9c_8fcf_80a3bb92fda0.slice/crio-ef5820865243e2fcf2f6f35e7acb8c711ebba03fe0607c843dd4d9d257a16e1b WatchSource:0}: Error finding container ef5820865243e2fcf2f6f35e7acb8c711ebba03fe0607c843dd4d9d257a16e1b: Status 404 returned error can't find the container with id ef5820865243e2fcf2f6f35e7acb8c711ebba03fe0607c843dd4d9d257a16e1b
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.062537 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79ff4b8df9-qp9mn" event={"ID":"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d","Type":"ContainerStarted","Data":"1b39e59dfea417f45beb77beb7be33544ccf333b38fbba38f6ed15ea2d745a77"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.062878 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79ff4b8df9-qp9mn" event={"ID":"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d","Type":"ContainerStarted","Data":"8fa5940f1022dd7512b6037c04cf7f862d14246b995b2fb4036b53c38847bbdd"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.088037 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e321362-ff76-4c26-bbde-9a97617ca460","Type":"ContainerStarted","Data":"1544e463846ecc6c58b261cb8a33d96be4cdf71b446787b0a0bf6a66673884ca"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.093691 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f644b5b-zlgdg" event={"ID":"dfdf38eb-05df-43ec-acb7-258da19c432a","Type":"ContainerStarted","Data":"4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.093765 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f644b5b-zlgdg" event={"ID":"dfdf38eb-05df-43ec-acb7-258da19c432a","Type":"ContainerStarted","Data":"227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.095305 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b4f644b5b-zlgdg"
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.099618 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba85559-136f-44e8-abc0-569ff409b64f","Type":"ContainerStarted","Data":"83d687f09c5f7f52eee6b78d59bb544c7b81984bbcadd67b79ad76c79bf77203"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.107940 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0","Type":"ContainerStarted","Data":"ef5820865243e2fcf2f6f35e7acb8c711ebba03fe0607c843dd4d9d257a16e1b"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.119740 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b4f644b5b-zlgdg" podStartSLOduration=4.119725838 podStartE2EDuration="4.119725838s" podCreationTimestamp="2026-03-20 13:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:36.117172951 +0000 UTC m=+1225.794896356" watchObservedRunningTime="2026-03-20 13:44:36.119725838 +0000 UTC m=+1225.797449233"
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.134006 4849 generic.go:334] "Generic (PLEG): container finished" podID="effa400c-2afb-4abd-a644-6c15cdad3ee1" containerID="a0eb1ae683075f418c02af4c3250e8b93290d3236fe805f6c5e6fdbee5c5a4c9" exitCode=0
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.134232 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" event={"ID":"effa400c-2afb-4abd-a644-6c15cdad3ee1","Type":"ContainerDied","Data":"a0eb1ae683075f418c02af4c3250e8b93290d3236fe805f6c5e6fdbee5c5a4c9"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.145189 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68899bcb64-snjqk" event={"ID":"4623c171-dfb8-42e6-9038-a95ed2871b75","Type":"ContainerStarted","Data":"717f03af8e71b4627fc46deedfa0ff5f51c7251a3525a9f6fbfab7cf39df9d1d"}
Mar 20 13:44:36 crc kubenswrapper[4849]: I0320 13:44:36.200049 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-68899bcb64-snjqk" podStartSLOduration=25.200029485 podStartE2EDuration="25.200029485s" podCreationTimestamp="2026-03-20 13:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:36.195924636 +0000 UTC m=+1225.873648051" watchObservedRunningTime="2026-03-20 13:44:36.200029485 +0000 UTC m=+1225.877752880"
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.165999 4849 generic.go:334] "Generic (PLEG): container finished" podID="39790e43-e227-4e13-8054-995e12255ec8" containerID="7ed8622f4b99236264791f2ed048fa77c650e00cce26a8711021fdbef54e5b37" exitCode=0
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.166194 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xbwzw" event={"ID":"39790e43-e227-4e13-8054-995e12255ec8","Type":"ContainerDied","Data":"7ed8622f4b99236264791f2ed048fa77c650e00cce26a8711021fdbef54e5b37"}
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.206784 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0","Type":"ContainerStarted","Data":"ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65"}
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.214591 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" event={"ID":"effa400c-2afb-4abd-a644-6c15cdad3ee1","Type":"ContainerStarted","Data":"0d89e398b4272f12dbec65de3b811005b6c969c4cdcee69ddc8c1ee9f61cccd5"}
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.215028 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv"
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.224184 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79ff4b8df9-qp9mn" event={"ID":"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d","Type":"ContainerStarted","Data":"941061793fb6895d944055b6d1b460415519760a1c867b677555e545de84da5d"}
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.224948 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79ff4b8df9-qp9mn"
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.227916 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba85559-136f-44e8-abc0-569ff409b64f","Type":"ContainerStarted","Data":"5404262e6156bb6146e3f6d868367e7d68faec96efa2497a9920e6192b4a8e47"}
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.253530 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" podStartSLOduration=5.253485745 podStartE2EDuration="5.253485745s" podCreationTimestamp="2026-03-20 13:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:37.237106331 +0000 UTC m=+1226.914829746" watchObservedRunningTime="2026-03-20 13:44:37.253485745 +0000 UTC m=+1226.931209140"
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.293270 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=20.293251888 podStartE2EDuration="20.293251888s" podCreationTimestamp="2026-03-20 13:44:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:37.275528879 +0000 UTC m=+1226.953252304" watchObservedRunningTime="2026-03-20 13:44:37.293251888 +0000 UTC m=+1226.970975273"
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.296272 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79ff4b8df9-qp9mn" podStartSLOduration=3.296256668 podStartE2EDuration="3.296256668s" podCreationTimestamp="2026-03-20 13:44:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:37.289746966 +0000 UTC m=+1226.967470371" watchObservedRunningTime="2026-03-20 13:44:37.296256668 +0000 UTC m=+1226.973980063"
Mar 20 13:44:37 crc kubenswrapper[4849]: I0320 13:44:37.348600 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kh6fq" podUID="e17094e8-90c1-4262-abaf-99d824238711" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.129:5353: i/o timeout"
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.238197 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0","Type":"ContainerStarted","Data":"358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f"}
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.252524 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.252835 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.275176 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.275153413 podStartE2EDuration="5.275153413s" podCreationTimestamp="2026-03-20 13:44:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:38.260807653 +0000 UTC m=+1227.938531048" watchObservedRunningTime="2026-03-20 13:44:38.275153413 +0000 UTC m=+1227.952876808"
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.314494 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.316133 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.822056 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xbwzw"
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.838454 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-scripts\") pod \"39790e43-e227-4e13-8054-995e12255ec8\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") "
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.839293 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-config-data\") pod \"39790e43-e227-4e13-8054-995e12255ec8\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") "
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.839345 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-combined-ca-bundle\") pod \"39790e43-e227-4e13-8054-995e12255ec8\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") "
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.839371 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39790e43-e227-4e13-8054-995e12255ec8-logs\") pod \"39790e43-e227-4e13-8054-995e12255ec8\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") "
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.839411 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f7qs\" (UniqueName: \"kubernetes.io/projected/39790e43-e227-4e13-8054-995e12255ec8-kube-api-access-7f7qs\") pod \"39790e43-e227-4e13-8054-995e12255ec8\" (UID: \"39790e43-e227-4e13-8054-995e12255ec8\") "
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.847830 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39790e43-e227-4e13-8054-995e12255ec8-logs" (OuterVolumeSpecName: "logs") pod "39790e43-e227-4e13-8054-995e12255ec8" (UID: "39790e43-e227-4e13-8054-995e12255ec8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.847957 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-scripts" (OuterVolumeSpecName: "scripts") pod "39790e43-e227-4e13-8054-995e12255ec8" (UID: "39790e43-e227-4e13-8054-995e12255ec8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.848085 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39790e43-e227-4e13-8054-995e12255ec8-kube-api-access-7f7qs" (OuterVolumeSpecName: "kube-api-access-7f7qs") pod "39790e43-e227-4e13-8054-995e12255ec8" (UID: "39790e43-e227-4e13-8054-995e12255ec8"). InnerVolumeSpecName "kube-api-access-7f7qs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.875030 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-config-data" (OuterVolumeSpecName: "config-data") pod "39790e43-e227-4e13-8054-995e12255ec8" (UID: "39790e43-e227-4e13-8054-995e12255ec8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.876112 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39790e43-e227-4e13-8054-995e12255ec8" (UID: "39790e43-e227-4e13-8054-995e12255ec8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.944140 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.944179 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.944194 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39790e43-e227-4e13-8054-995e12255ec8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.944206 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39790e43-e227-4e13-8054-995e12255ec8-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4849]: I0320 13:44:38.944219 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f7qs\" (UniqueName: \"kubernetes.io/projected/39790e43-e227-4e13-8054-995e12255ec8-kube-api-access-7f7qs\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.248175 4849 generic.go:334] "Generic (PLEG): container finished" podID="6f18d572-488f-4e4e-9596-3b99b5298123" containerID="80dd9311d2d3c9a0256f25fa30c71e71b906754fd19ca64fafe7cba22f93c606" exitCode=0
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.248237 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6qksz" event={"ID":"6f18d572-488f-4e4e-9596-3b99b5298123","Type":"ContainerDied","Data":"80dd9311d2d3c9a0256f25fa30c71e71b906754fd19ca64fafe7cba22f93c606"}
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.252223 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-xbwzw"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.252629 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-xbwzw" event={"ID":"39790e43-e227-4e13-8054-995e12255ec8","Type":"ContainerDied","Data":"d9a9d571166c36fcb656e222c6eae2245c293836dbc168c8f5aa8a2f8dafaa40"}
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.252655 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a9d571166c36fcb656e222c6eae2245c293836dbc168c8f5aa8a2f8dafaa40"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.253125 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.253735 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.333025 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-58468695c6-sg4wf"]
Mar 20 13:44:39 crc kubenswrapper[4849]: E0320 13:44:39.333449 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39790e43-e227-4e13-8054-995e12255ec8" containerName="placement-db-sync"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.333465 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="39790e43-e227-4e13-8054-995e12255ec8" containerName="placement-db-sync"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.333718 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="39790e43-e227-4e13-8054-995e12255ec8" containerName="placement-db-sync"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.334766 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58468695c6-sg4wf"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.339908 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.340063 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.340071 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.340217 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.340391 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rdl7q"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.346953 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58468695c6-sg4wf"]
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.387165 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.387216 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.387259 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.387901 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"320fbdc873fdc9693c329a47d54d9c46e735feb487e1c2d7c4da734e3de67821"} pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.387957 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" containerID="cri-o://320fbdc873fdc9693c329a47d54d9c46e735feb487e1c2d7c4da734e3de67821" gracePeriod=600
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.453963 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-internal-tls-certs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.454002 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpm8b\" (UniqueName: \"kubernetes.io/projected/f33c702a-869d-44ae-ab1c-a52d1bb71740-kube-api-access-wpm8b\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.454027 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-config-data\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.454191 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33c702a-869d-44ae-ab1c-a52d1bb71740-logs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.454214 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-public-tls-certs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.454235 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-scripts\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.454253 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-combined-ca-bundle\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf"
Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.556833 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-scripts\") pod \"placement-58468695c6-sg4wf\" (UID: 
\"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.557175 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-combined-ca-bundle\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.557238 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-internal-tls-certs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.557264 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpm8b\" (UniqueName: \"kubernetes.io/projected/f33c702a-869d-44ae-ab1c-a52d1bb71740-kube-api-access-wpm8b\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.557289 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-config-data\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.557458 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33c702a-869d-44ae-ab1c-a52d1bb71740-logs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " 
pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.557485 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-public-tls-certs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.559707 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33c702a-869d-44ae-ab1c-a52d1bb71740-logs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.572415 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-scripts\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.572648 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-public-tls-certs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.581085 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-config-data\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.586223 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-combined-ca-bundle\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.587570 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpm8b\" (UniqueName: \"kubernetes.io/projected/f33c702a-869d-44ae-ab1c-a52d1bb71740-kube-api-access-wpm8b\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.587632 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-internal-tls-certs\") pod \"placement-58468695c6-sg4wf\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:39 crc kubenswrapper[4849]: I0320 13:44:39.675396 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.183264 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58468695c6-sg4wf"] Mar 20 13:44:40 crc kubenswrapper[4849]: W0320 13:44:40.219015 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf33c702a_869d_44ae_ab1c_a52d1bb71740.slice/crio-eae167f063b1c375d544fd3efd75f9c0bb0b42e752142c6228502d68f51cfaf2 WatchSource:0}: Error finding container eae167f063b1c375d544fd3efd75f9c0bb0b42e752142c6228502d68f51cfaf2: Status 404 returned error can't find the container with id eae167f063b1c375d544fd3efd75f9c0bb0b42e752142c6228502d68f51cfaf2 Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.278624 4849 generic.go:334] "Generic (PLEG): container finished" podID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerID="320fbdc873fdc9693c329a47d54d9c46e735feb487e1c2d7c4da734e3de67821" exitCode=0 Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.278951 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerDied","Data":"320fbdc873fdc9693c329a47d54d9c46e735feb487e1c2d7c4da734e3de67821"} Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.278981 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"7a773bf9c49237a354678cd3d44df741a149b88bc3cf8989ae80a8e48fb75b7b"} Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.278997 4849 scope.go:117] "RemoveContainer" containerID="796d63258641a3af91f7958992403b9a5ad9b68fcc83db460f8e4cc151f123e0" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.281566 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-58468695c6-sg4wf" event={"ID":"f33c702a-869d-44ae-ab1c-a52d1bb71740","Type":"ContainerStarted","Data":"eae167f063b1c375d544fd3efd75f9c0bb0b42e752142c6228502d68f51cfaf2"} Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.652259 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.785349 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-combined-ca-bundle\") pod \"6f18d572-488f-4e4e-9596-3b99b5298123\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.785396 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-config-data\") pod \"6f18d572-488f-4e4e-9596-3b99b5298123\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.785468 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-scripts\") pod \"6f18d572-488f-4e4e-9596-3b99b5298123\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.785583 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-credential-keys\") pod \"6f18d572-488f-4e4e-9596-3b99b5298123\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.785662 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-fernet-keys\") pod \"6f18d572-488f-4e4e-9596-3b99b5298123\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.785728 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cghm\" (UniqueName: \"kubernetes.io/projected/6f18d572-488f-4e4e-9596-3b99b5298123-kube-api-access-2cghm\") pod \"6f18d572-488f-4e4e-9596-3b99b5298123\" (UID: \"6f18d572-488f-4e4e-9596-3b99b5298123\") " Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.790679 4849 scope.go:117] "RemoveContainer" containerID="5184e57a6bad63b6b1b3dc7a9fcadb96b54de3bf36fd9b712578daac179fb823" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.807515 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f18d572-488f-4e4e-9596-3b99b5298123-kube-api-access-2cghm" (OuterVolumeSpecName: "kube-api-access-2cghm") pod "6f18d572-488f-4e4e-9596-3b99b5298123" (UID: "6f18d572-488f-4e4e-9596-3b99b5298123"). InnerVolumeSpecName "kube-api-access-2cghm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.808631 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6f18d572-488f-4e4e-9596-3b99b5298123" (UID: "6f18d572-488f-4e4e-9596-3b99b5298123"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.813109 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6f18d572-488f-4e4e-9596-3b99b5298123" (UID: "6f18d572-488f-4e4e-9596-3b99b5298123"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.813155 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-scripts" (OuterVolumeSpecName: "scripts") pod "6f18d572-488f-4e4e-9596-3b99b5298123" (UID: "6f18d572-488f-4e4e-9596-3b99b5298123"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.850736 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f18d572-488f-4e4e-9596-3b99b5298123" (UID: "6f18d572-488f-4e4e-9596-3b99b5298123"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.852904 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-config-data" (OuterVolumeSpecName: "config-data") pod "6f18d572-488f-4e4e-9596-3b99b5298123" (UID: "6f18d572-488f-4e4e-9596-3b99b5298123"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.887087 4849 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.887116 4849 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.887126 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cghm\" (UniqueName: \"kubernetes.io/projected/6f18d572-488f-4e4e-9596-3b99b5298123-kube-api-access-2cghm\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.887137 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.887145 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:40 crc kubenswrapper[4849]: I0320 13:44:40.887153 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f18d572-488f-4e4e-9596-3b99b5298123-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.306021 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6qksz" event={"ID":"6f18d572-488f-4e4e-9596-3b99b5298123","Type":"ContainerDied","Data":"e4cdae3a8850cf2180d180ae6a12aafb6edda5e4d415c34c7e7cd93b91150af6"} Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 
13:44:41.306307 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4cdae3a8850cf2180d180ae6a12aafb6edda5e4d415c34c7e7cd93b91150af6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.306367 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6qksz" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.309890 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58468695c6-sg4wf" event={"ID":"f33c702a-869d-44ae-ab1c-a52d1bb71740","Type":"ContainerStarted","Data":"fdfe8719c7f110e47650840b9f6a310716efb6a3efe80bd4fdb7f954b6cd1cd0"} Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.309942 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58468695c6-sg4wf" event={"ID":"f33c702a-869d-44ae-ab1c-a52d1bb71740","Type":"ContainerStarted","Data":"53e6f807bee5ada97b5510e01bdf755c584777eef52bc7f8051198ab6aa5e603"} Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.310885 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.310908 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.398119 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-58468695c6-sg4wf" podStartSLOduration=2.398099811 podStartE2EDuration="2.398099811s" podCreationTimestamp="2026-03-20 13:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:41.333435779 +0000 UTC m=+1231.011159194" watchObservedRunningTime="2026-03-20 13:44:41.398099811 +0000 UTC m=+1231.075823206" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.405199 4849 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/keystone-7d5f885888-6vtg6"] Mar 20 13:44:41 crc kubenswrapper[4849]: E0320 13:44:41.405594 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f18d572-488f-4e4e-9596-3b99b5298123" containerName="keystone-bootstrap" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.405609 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f18d572-488f-4e4e-9596-3b99b5298123" containerName="keystone-bootstrap" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.405874 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f18d572-488f-4e4e-9596-3b99b5298123" containerName="keystone-bootstrap" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.407055 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.415343 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.415391 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.415411 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.415352 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.416212 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-k5zkb" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.418240 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d5f885888-6vtg6"] Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.419333 4849 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-scripts" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.502076 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-fernet-keys\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.502135 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-internal-tls-certs\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.502242 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-config-data\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.502269 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r8ml\" (UniqueName: \"kubernetes.io/projected/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-kube-api-access-2r8ml\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.502293 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-scripts\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " 
pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.502566 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-public-tls-certs\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.502669 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-combined-ca-bundle\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.502760 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-credential-keys\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.569193 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-68899bcb64-snjqk" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.569305 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68899bcb64-snjqk" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.604002 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-config-data\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc 
kubenswrapper[4849]: I0320 13:44:41.604045 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r8ml\" (UniqueName: \"kubernetes.io/projected/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-kube-api-access-2r8ml\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.604063 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-scripts\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.604125 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-public-tls-certs\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.604149 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-combined-ca-bundle\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.604179 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-credential-keys\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.604214 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-fernet-keys\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.604232 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-internal-tls-certs\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.611317 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-internal-tls-certs\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.611508 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-public-tls-certs\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.614555 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-config-data\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.616153 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-credential-keys\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.619295 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-fernet-keys\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.620138 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-scripts\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.620635 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-combined-ca-bundle\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.632344 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r8ml\" (UniqueName: \"kubernetes.io/projected/f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9-kube-api-access-2r8ml\") pod \"keystone-7d5f885888-6vtg6\" (UID: \"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9\") " pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.732188 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.887128 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7f784755c6-j267c" Mar 20 13:44:41 crc kubenswrapper[4849]: I0320 13:44:41.888173 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7f784755c6-j267c" Mar 20 13:44:42 crc kubenswrapper[4849]: I0320 13:44:42.399930 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:44:42 crc kubenswrapper[4849]: I0320 13:44:42.412859 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d5f885888-6vtg6"] Mar 20 13:44:42 crc kubenswrapper[4849]: I0320 13:44:42.474228 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lftjx"] Mar 20 13:44:42 crc kubenswrapper[4849]: I0320 13:44:42.474761 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerName="dnsmasq-dns" containerID="cri-o://e9a47e91b926a8c035fd933101b4bef4dcde35f1ecbdfe2fa6c339495ec73e34" gracePeriod=10 Mar 20 13:44:42 crc kubenswrapper[4849]: I0320 13:44:42.706916 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:44:43 crc kubenswrapper[4849]: I0320 13:44:43.305706 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:43 crc kubenswrapper[4849]: I0320 13:44:43.353408 4849 generic.go:334] "Generic (PLEG): container finished" podID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerID="e9a47e91b926a8c035fd933101b4bef4dcde35f1ecbdfe2fa6c339495ec73e34" exitCode=0 Mar 20 13:44:43 crc kubenswrapper[4849]: I0320 13:44:43.353677 4849 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" event={"ID":"deeae318-4f97-4be7-90dc-c63f22dcf3a6","Type":"ContainerDied","Data":"e9a47e91b926a8c035fd933101b4bef4dcde35f1ecbdfe2fa6c339495ec73e34"} Mar 20 13:44:43 crc kubenswrapper[4849]: I0320 13:44:43.757756 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused" Mar 20 13:44:44 crc kubenswrapper[4849]: I0320 13:44:44.122844 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:44:44 crc kubenswrapper[4849]: I0320 13:44:44.293791 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:44:44 crc kubenswrapper[4849]: I0320 13:44:44.293853 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:44:44 crc kubenswrapper[4849]: I0320 13:44:44.359457 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:44:44 crc kubenswrapper[4849]: I0320 13:44:44.359771 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:44:44 crc kubenswrapper[4849]: I0320 13:44:44.370956 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:44:44 crc kubenswrapper[4849]: I0320 13:44:44.371004 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:44:46 crc kubenswrapper[4849]: I0320 13:44:46.475046 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Mar 20 13:44:46 crc kubenswrapper[4849]: I0320 13:44:46.475465 4849 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:44:46 crc kubenswrapper[4849]: I0320 13:44:46.476656 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:44:48 crc kubenswrapper[4849]: W0320 13:44:48.582169 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f4b109_3301_4c4d_9f8f_6fc3fe7b41e9.slice/crio-69c5c1c94d4a58d14b5d7d13ed2b185efd17ccddb61225993403cf367ef67d9b WatchSource:0}: Error finding container 69c5c1c94d4a58d14b5d7d13ed2b185efd17ccddb61225993403cf367ef67d9b: Status 404 returned error can't find the container with id 69c5c1c94d4a58d14b5d7d13ed2b185efd17ccddb61225993403cf367ef67d9b Mar 20 13:44:48 crc kubenswrapper[4849]: I0320 13:44:48.947776 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.010478 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-config\") pod \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.010588 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-sb\") pod \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.010645 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-svc\") pod \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.010674 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-nb\") pod \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.010743 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-swift-storage-0\") pod \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.010804 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttq4h\" 
(UniqueName: \"kubernetes.io/projected/deeae318-4f97-4be7-90dc-c63f22dcf3a6-kube-api-access-ttq4h\") pod \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\" (UID: \"deeae318-4f97-4be7-90dc-c63f22dcf3a6\") " Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.015413 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deeae318-4f97-4be7-90dc-c63f22dcf3a6-kube-api-access-ttq4h" (OuterVolumeSpecName: "kube-api-access-ttq4h") pod "deeae318-4f97-4be7-90dc-c63f22dcf3a6" (UID: "deeae318-4f97-4be7-90dc-c63f22dcf3a6"). InnerVolumeSpecName "kube-api-access-ttq4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.111455 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "deeae318-4f97-4be7-90dc-c63f22dcf3a6" (UID: "deeae318-4f97-4be7-90dc-c63f22dcf3a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.111678 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "deeae318-4f97-4be7-90dc-c63f22dcf3a6" (UID: "deeae318-4f97-4be7-90dc-c63f22dcf3a6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.112965 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "deeae318-4f97-4be7-90dc-c63f22dcf3a6" (UID: "deeae318-4f97-4be7-90dc-c63f22dcf3a6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.112996 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttq4h\" (UniqueName: \"kubernetes.io/projected/deeae318-4f97-4be7-90dc-c63f22dcf3a6-kube-api-access-ttq4h\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.113054 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.113071 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.117506 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-config" (OuterVolumeSpecName: "config") pod "deeae318-4f97-4be7-90dc-c63f22dcf3a6" (UID: "deeae318-4f97-4be7-90dc-c63f22dcf3a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.145353 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "deeae318-4f97-4be7-90dc-c63f22dcf3a6" (UID: "deeae318-4f97-4be7-90dc-c63f22dcf3a6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.214326 4849 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.214355 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.214364 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/deeae318-4f97-4be7-90dc-c63f22dcf3a6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.426262 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e321362-ff76-4c26-bbde-9a97617ca460","Type":"ContainerStarted","Data":"40c515705c039c38d9fd6e728051db6c39e1c019d97d1246f992369906bd4ac0"} Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.427980 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwq28" event={"ID":"ee9399c2-4755-4acd-8514-7d49cdd92f16","Type":"ContainerStarted","Data":"f0f839010b6717e6ff88a8ac36737355463b9967cfada5d839c52e8f21a81747"} Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.446091 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" event={"ID":"deeae318-4f97-4be7-90dc-c63f22dcf3a6","Type":"ContainerDied","Data":"2322908ae870a1181a7a176015c1e34cf2b59f39ecc865b7618029843a2d2d73"} Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.446145 4849 scope.go:117] "RemoveContainer" containerID="e9a47e91b926a8c035fd933101b4bef4dcde35f1ecbdfe2fa6c339495ec73e34" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 
13:44:49.446294 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.455457 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gwq28" podStartSLOduration=2.884844243 podStartE2EDuration="47.455438982s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="2026-03-20 13:44:04.162555164 +0000 UTC m=+1193.840278559" lastFinishedPulling="2026-03-20 13:44:48.733149903 +0000 UTC m=+1238.410873298" observedRunningTime="2026-03-20 13:44:49.454294182 +0000 UTC m=+1239.132017577" watchObservedRunningTime="2026-03-20 13:44:49.455438982 +0000 UTC m=+1239.133162377" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.460992 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d5f885888-6vtg6" event={"ID":"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9","Type":"ContainerStarted","Data":"ca40a59e551ab094d704e2f13b707ceb008755f7fd48fdb7226d9d65db3583d1"} Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.461026 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d5f885888-6vtg6" event={"ID":"f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9","Type":"ContainerStarted","Data":"69c5c1c94d4a58d14b5d7d13ed2b185efd17ccddb61225993403cf367ef67d9b"} Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.463934 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.494636 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d5f885888-6vtg6" podStartSLOduration=8.49461962 podStartE2EDuration="8.49461962s" podCreationTimestamp="2026-03-20 13:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 
13:44:49.483566777 +0000 UTC m=+1239.161290202" watchObservedRunningTime="2026-03-20 13:44:49.49461962 +0000 UTC m=+1239.172343015" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.504911 4849 scope.go:117] "RemoveContainer" containerID="de1ae04c0e9fd1ae4a4c5aa8908f7eb713f9ec7870dc01b9b4df79d4b841a788" Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.512227 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lftjx"] Mar 20 13:44:49 crc kubenswrapper[4849]: I0320 13:44:49.522215 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-lftjx"] Mar 20 13:44:50 crc kubenswrapper[4849]: I0320 13:44:50.472196 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jk575" event={"ID":"701dfbaa-ecac-4290-9402-90c866ccd108","Type":"ContainerStarted","Data":"ff1b1f41e9211f3980105e4e2ceb94ed7dbd5707e99659148e01c814de2c1342"} Mar 20 13:44:50 crc kubenswrapper[4849]: I0320 13:44:50.502940 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jk575" podStartSLOduration=3.960798618 podStartE2EDuration="48.502917543s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="2026-03-20 13:44:04.192875537 +0000 UTC m=+1193.870598932" lastFinishedPulling="2026-03-20 13:44:48.734994462 +0000 UTC m=+1238.412717857" observedRunningTime="2026-03-20 13:44:50.494307545 +0000 UTC m=+1240.172030970" watchObservedRunningTime="2026-03-20 13:44:50.502917543 +0000 UTC m=+1240.180640948" Mar 20 13:44:51 crc kubenswrapper[4849]: I0320 13:44:51.061301 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" path="/var/lib/kubelet/pods/deeae318-4f97-4be7-90dc-c63f22dcf3a6/volumes" Mar 20 13:44:51 crc kubenswrapper[4849]: I0320 13:44:51.487851 4849 generic.go:334] "Generic (PLEG): container finished" podID="ee9399c2-4755-4acd-8514-7d49cdd92f16" 
containerID="f0f839010b6717e6ff88a8ac36737355463b9967cfada5d839c52e8f21a81747" exitCode=0 Mar 20 13:44:51 crc kubenswrapper[4849]: I0320 13:44:51.487939 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwq28" event={"ID":"ee9399c2-4755-4acd-8514-7d49cdd92f16","Type":"ContainerDied","Data":"f0f839010b6717e6ff88a8ac36737355463b9967cfada5d839c52e8f21a81747"} Mar 20 13:44:51 crc kubenswrapper[4849]: I0320 13:44:51.574030 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-68899bcb64-snjqk" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 13:44:51 crc kubenswrapper[4849]: I0320 13:44:51.892175 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7f784755c6-j267c" podUID="852cbb75-7003-4545-9b7b-b2eb83d269ac" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.879171 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.894140 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-db-sync-config-data\") pod \"ee9399c2-4755-4acd-8514-7d49cdd92f16\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.894428 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-combined-ca-bundle\") pod \"ee9399c2-4755-4acd-8514-7d49cdd92f16\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.894493 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xxcp\" (UniqueName: \"kubernetes.io/projected/ee9399c2-4755-4acd-8514-7d49cdd92f16-kube-api-access-7xxcp\") pod \"ee9399c2-4755-4acd-8514-7d49cdd92f16\" (UID: \"ee9399c2-4755-4acd-8514-7d49cdd92f16\") " Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.921298 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ee9399c2-4755-4acd-8514-7d49cdd92f16" (UID: "ee9399c2-4755-4acd-8514-7d49cdd92f16"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.926040 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee9399c2-4755-4acd-8514-7d49cdd92f16" (UID: "ee9399c2-4755-4acd-8514-7d49cdd92f16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.927261 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9399c2-4755-4acd-8514-7d49cdd92f16-kube-api-access-7xxcp" (OuterVolumeSpecName: "kube-api-access-7xxcp") pod "ee9399c2-4755-4acd-8514-7d49cdd92f16" (UID: "ee9399c2-4755-4acd-8514-7d49cdd92f16"). InnerVolumeSpecName "kube-api-access-7xxcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.998222 4849 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.999019 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee9399c2-4755-4acd-8514-7d49cdd92f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4849]: I0320 13:44:52.999104 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xxcp\" (UniqueName: \"kubernetes.io/projected/ee9399c2-4755-4acd-8514-7d49cdd92f16-kube-api-access-7xxcp\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.510093 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwq28" event={"ID":"ee9399c2-4755-4acd-8514-7d49cdd92f16","Type":"ContainerDied","Data":"75cc55c6f8b591eb4808ebe6dd3d5b4d3f3cf28a40b876aa3e18f863ab38e4b8"} Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.510342 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75cc55c6f8b591eb4808ebe6dd3d5b4d3f3cf28a40b876aa3e18f863ab38e4b8" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.510259 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gwq28" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.763949 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-lftjx" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: i/o timeout" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.764797 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-845957dc9-clhc5"] Mar 20 13:44:53 crc kubenswrapper[4849]: E0320 13:44:53.765153 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9399c2-4755-4acd-8514-7d49cdd92f16" containerName="barbican-db-sync" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.765165 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9399c2-4755-4acd-8514-7d49cdd92f16" containerName="barbican-db-sync" Mar 20 13:44:53 crc kubenswrapper[4849]: E0320 13:44:53.765179 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerName="dnsmasq-dns" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.765185 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerName="dnsmasq-dns" Mar 20 13:44:53 crc kubenswrapper[4849]: E0320 13:44:53.765215 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerName="init" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.765221 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerName="init" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.765392 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9399c2-4755-4acd-8514-7d49cdd92f16" containerName="barbican-db-sync" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.765404 4849 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="deeae318-4f97-4be7-90dc-c63f22dcf3a6" containerName="dnsmasq-dns" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.766246 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.769292 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.769440 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.775617 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-845957dc9-clhc5"] Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.775897 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-44cqd" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.810038 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6f759b9866-rl5dd"] Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.813413 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.816016 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.817570 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zk5c\" (UniqueName: \"kubernetes.io/projected/f29aa501-5db8-44ee-b155-a2ffe7b521bc-kube-api-access-2zk5c\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.817728 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-config-data\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.817807 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-config-data-custom\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.817850 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29aa501-5db8-44ee-b155-a2ffe7b521bc-logs\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.817974 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-combined-ca-bundle\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.874232 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f759b9866-rl5dd"] Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.887684 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-jtrvf"] Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.889207 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919713 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919775 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-svc\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919807 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" 
(UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919837 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919868 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zk5c\" (UniqueName: \"kubernetes.io/projected/f29aa501-5db8-44ee-b155-a2ffe7b521bc-kube-api-access-2zk5c\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919909 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-config-data-custom\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919935 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-config-data\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919960 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-config-data-custom\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919979 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-config-data\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.919998 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29aa501-5db8-44ee-b155-a2ffe7b521bc-logs\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.920017 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f30102de-0f18-4a4a-80e6-58d2de7c230d-logs\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.920038 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb4l5\" (UniqueName: \"kubernetes.io/projected/f30102de-0f18-4a4a-80e6-58d2de7c230d-kube-api-access-bb4l5\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.920058 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-combined-ca-bundle\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.920098 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-combined-ca-bundle\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.920134 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-config\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.920171 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dplm5\" (UniqueName: \"kubernetes.io/projected/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-kube-api-access-dplm5\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.921549 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f29aa501-5db8-44ee-b155-a2ffe7b521bc-logs\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.940018 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zk5c\" (UniqueName: \"kubernetes.io/projected/f29aa501-5db8-44ee-b155-a2ffe7b521bc-kube-api-access-2zk5c\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.943866 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-jtrvf"] Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.944538 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-config-data-custom\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.946180 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-config-data\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:53 crc kubenswrapper[4849]: I0320 13:44:53.947365 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29aa501-5db8-44ee-b155-a2ffe7b521bc-combined-ca-bundle\") pod \"barbican-worker-845957dc9-clhc5\" (UID: \"f29aa501-5db8-44ee-b155-a2ffe7b521bc\") " pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.019855 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57974449fd-mzjh9"] Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.021469 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023162 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dplm5\" (UniqueName: \"kubernetes.io/projected/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-kube-api-access-dplm5\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023197 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023236 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-svc\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023258 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023279 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " 
pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023326 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-config-data-custom\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023355 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-config-data\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023375 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f30102de-0f18-4a4a-80e6-58d2de7c230d-logs\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023395 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb4l5\" (UniqueName: \"kubernetes.io/projected/f30102de-0f18-4a4a-80e6-58d2de7c230d-kube-api-access-bb4l5\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023414 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-combined-ca-bundle\") pod 
\"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.023458 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-config\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.024658 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f30102de-0f18-4a4a-80e6-58d2de7c230d-logs\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.024750 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.024999 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-svc\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.025342 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: 
\"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.025645 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.025871 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.032737 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-combined-ca-bundle\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.034452 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-config-data-custom\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.035265 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-config\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.035551 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f30102de-0f18-4a4a-80e6-58d2de7c230d-config-data\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.042153 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb4l5\" (UniqueName: \"kubernetes.io/projected/f30102de-0f18-4a4a-80e6-58d2de7c230d-kube-api-access-bb4l5\") pod \"barbican-keystone-listener-6f759b9866-rl5dd\" (UID: \"f30102de-0f18-4a4a-80e6-58d2de7c230d\") " pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.043788 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57974449fd-mzjh9"] Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.054687 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dplm5\" (UniqueName: \"kubernetes.io/projected/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-kube-api-access-dplm5\") pod \"dnsmasq-dns-85ff748b95-jtrvf\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.101943 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-845957dc9-clhc5" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.131288 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data-custom\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.131363 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-combined-ca-bundle\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.131404 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.131448 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ca49bbd-dd90-4eee-bd44-59934f7d757e-logs\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.131484 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwnv9\" (UniqueName: \"kubernetes.io/projected/5ca49bbd-dd90-4eee-bd44-59934f7d757e-kube-api-access-bwnv9\") pod 
\"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.148410 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.226443 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.233948 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-combined-ca-bundle\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.234005 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.234049 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ca49bbd-dd90-4eee-bd44-59934f7d757e-logs\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.234081 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwnv9\" (UniqueName: \"kubernetes.io/projected/5ca49bbd-dd90-4eee-bd44-59934f7d757e-kube-api-access-bwnv9\") pod \"barbican-api-57974449fd-mzjh9\" (UID: 
\"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.234176 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data-custom\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.234625 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ca49bbd-dd90-4eee-bd44-59934f7d757e-logs\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.239742 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data-custom\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.243225 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.244124 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-combined-ca-bundle\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" 
Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.264481 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwnv9\" (UniqueName: \"kubernetes.io/projected/5ca49bbd-dd90-4eee-bd44-59934f7d757e-kube-api-access-bwnv9\") pod \"barbican-api-57974449fd-mzjh9\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:54 crc kubenswrapper[4849]: I0320 13:44:54.458582 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:44:55 crc kubenswrapper[4849]: I0320 13:44:55.535915 4849 generic.go:334] "Generic (PLEG): container finished" podID="701dfbaa-ecac-4290-9402-90c866ccd108" containerID="ff1b1f41e9211f3980105e4e2ceb94ed7dbd5707e99659148e01c814de2c1342" exitCode=0 Mar 20 13:44:55 crc kubenswrapper[4849]: I0320 13:44:55.535960 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jk575" event={"ID":"701dfbaa-ecac-4290-9402-90c866ccd108","Type":"ContainerDied","Data":"ff1b1f41e9211f3980105e4e2ceb94ed7dbd5707e99659148e01c814de2c1342"} Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.291003 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-68775b9c9d-w9j9w"] Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.294563 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.296409 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.296724 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.304321 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68775b9c9d-w9j9w"] Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.398123 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn2pk\" (UniqueName: \"kubernetes.io/projected/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-kube-api-access-hn2pk\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.398174 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-config-data-custom\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.398203 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-config-data\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.398222 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-combined-ca-bundle\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.398259 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-logs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.398334 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-internal-tls-certs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.398390 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-public-tls-certs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.500141 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-config-data-custom\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.500189 4849 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-config-data\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.500208 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-combined-ca-bundle\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.500245 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-logs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.500317 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-internal-tls-certs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.500356 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-public-tls-certs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.500381 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn2pk\" (UniqueName: 
\"kubernetes.io/projected/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-kube-api-access-hn2pk\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.500794 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-logs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.505775 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-config-data-custom\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.507600 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-public-tls-certs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.507880 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-internal-tls-certs\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.508087 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-combined-ca-bundle\") 
pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.508332 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-config-data\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.523739 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn2pk\" (UniqueName: \"kubernetes.io/projected/cbe1119f-65b7-4aea-a636-1d745ea8e3b6-kube-api-access-hn2pk\") pod \"barbican-api-68775b9c9d-w9j9w\" (UID: \"cbe1119f-65b7-4aea-a636-1d745ea8e3b6\") " pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:56 crc kubenswrapper[4849]: I0320 13:44:56.619314 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.062798 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.109185 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wchsg\" (UniqueName: \"kubernetes.io/projected/701dfbaa-ecac-4290-9402-90c866ccd108-kube-api-access-wchsg\") pod \"701dfbaa-ecac-4290-9402-90c866ccd108\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.109231 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-scripts\") pod \"701dfbaa-ecac-4290-9402-90c866ccd108\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.109271 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-db-sync-config-data\") pod \"701dfbaa-ecac-4290-9402-90c866ccd108\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.109380 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/701dfbaa-ecac-4290-9402-90c866ccd108-etc-machine-id\") pod \"701dfbaa-ecac-4290-9402-90c866ccd108\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.109420 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-config-data\") pod \"701dfbaa-ecac-4290-9402-90c866ccd108\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.109441 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-combined-ca-bundle\") pod \"701dfbaa-ecac-4290-9402-90c866ccd108\" (UID: \"701dfbaa-ecac-4290-9402-90c866ccd108\") " Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.109538 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/701dfbaa-ecac-4290-9402-90c866ccd108-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "701dfbaa-ecac-4290-9402-90c866ccd108" (UID: "701dfbaa-ecac-4290-9402-90c866ccd108"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.109904 4849 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/701dfbaa-ecac-4290-9402-90c866ccd108-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.124460 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-scripts" (OuterVolumeSpecName: "scripts") pod "701dfbaa-ecac-4290-9402-90c866ccd108" (UID: "701dfbaa-ecac-4290-9402-90c866ccd108"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.126283 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "701dfbaa-ecac-4290-9402-90c866ccd108" (UID: "701dfbaa-ecac-4290-9402-90c866ccd108"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.127495 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701dfbaa-ecac-4290-9402-90c866ccd108-kube-api-access-wchsg" (OuterVolumeSpecName: "kube-api-access-wchsg") pod "701dfbaa-ecac-4290-9402-90c866ccd108" (UID: "701dfbaa-ecac-4290-9402-90c866ccd108"). InnerVolumeSpecName "kube-api-access-wchsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.142613 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701dfbaa-ecac-4290-9402-90c866ccd108" (UID: "701dfbaa-ecac-4290-9402-90c866ccd108"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.182943 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-config-data" (OuterVolumeSpecName: "config-data") pod "701dfbaa-ecac-4290-9402-90c866ccd108" (UID: "701dfbaa-ecac-4290-9402-90c866ccd108"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.211284 4849 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.211344 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.211354 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.211363 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wchsg\" (UniqueName: \"kubernetes.io/projected/701dfbaa-ecac-4290-9402-90c866ccd108-kube-api-access-wchsg\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.211372 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701dfbaa-ecac-4290-9402-90c866ccd108-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.557977 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jk575" event={"ID":"701dfbaa-ecac-4290-9402-90c866ccd108","Type":"ContainerDied","Data":"71fb2390e3c750900edccd8105a8978199d55cf1005899cec2eb35f52717c3cb"} Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.558107 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71fb2390e3c750900edccd8105a8978199d55cf1005899cec2eb35f52717c3cb" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.558068 4849 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jk575" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.776111 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:44:57 crc kubenswrapper[4849]: E0320 13:44:57.776687 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701dfbaa-ecac-4290-9402-90c866ccd108" containerName="cinder-db-sync" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.776699 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="701dfbaa-ecac-4290-9402-90c866ccd108" containerName="cinder-db-sync" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.776890 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="701dfbaa-ecac-4290-9402-90c866ccd108" containerName="cinder-db-sync" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.777705 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.779967 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.780128 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.780161 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.781516 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-plg4l" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.818017 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.822233 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.822315 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.822383 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f41b707-ab00-42a2-9472-cd761733addc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.822498 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-scripts\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.822521 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2z6m\" (UniqueName: \"kubernetes.io/projected/0f41b707-ab00-42a2-9472-cd761733addc-kube-api-access-g2z6m\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.822599 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.866429 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-jtrvf"] Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.879386 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-l8pb4"] Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.899217 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.928722 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-scripts\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.928765 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2z6m\" (UniqueName: \"kubernetes.io/projected/0f41b707-ab00-42a2-9472-cd761733addc-kube-api-access-g2z6m\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.928837 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.928881 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.928922 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.928939 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f41b707-ab00-42a2-9472-cd761733addc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.929972 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f41b707-ab00-42a2-9472-cd761733addc-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.929972 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-l8pb4"] Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.937763 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.937841 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-scripts\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.941341 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.942363 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.947349 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2z6m\" (UniqueName: \"kubernetes.io/projected/0f41b707-ab00-42a2-9472-cd761733addc-kube-api-access-g2z6m\") pod \"cinder-scheduler-0\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " pod="openstack/cinder-scheduler-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.989364 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.992030 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:44:57 crc kubenswrapper[4849]: I0320 13:44:57.998385 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.008261 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030016 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-logs\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030070 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030097 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030160 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-scripts\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030182 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030202 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc9pl\" (UniqueName: \"kubernetes.io/projected/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-kube-api-access-qc9pl\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030260 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030283 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030329 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcbmw\" (UniqueName: \"kubernetes.io/projected/e33ed079-a0fe-4167-98b1-25339aaf90d2-kube-api-access-tcbmw\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030356 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030385 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030406 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-config\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.030445 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.104871 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132087 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-logs\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132222 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132274 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132425 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-scripts\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132465 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132481 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc9pl\" (UniqueName: 
\"kubernetes.io/projected/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-kube-api-access-qc9pl\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132588 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-logs\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132646 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132706 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132787 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcbmw\" (UniqueName: \"kubernetes.io/projected/e33ed079-a0fe-4167-98b1-25339aaf90d2-kube-api-access-tcbmw\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132807 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: 
\"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132862 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-config\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132882 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.132953 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.133480 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.134180 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: 
I0320 13:44:58.134302 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.134521 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-config\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.134592 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.136880 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.137502 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.137627 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data-custom\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.140162 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-scripts\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.140934 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.152580 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc9pl\" (UniqueName: \"kubernetes.io/projected/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-kube-api-access-qc9pl\") pod \"cinder-api-0\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " pod="openstack/cinder-api-0" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.152830 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcbmw\" (UniqueName: \"kubernetes.io/projected/e33ed079-a0fe-4167-98b1-25339aaf90d2-kube-api-access-tcbmw\") pod \"dnsmasq-dns-5c9776ccc5-l8pb4\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.330226 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:44:58 crc kubenswrapper[4849]: I0320 13:44:58.356419 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.327385 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68775b9c9d-w9j9w"] Mar 20 13:44:59 crc kubenswrapper[4849]: W0320 13:44:59.334679 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbe1119f_65b7_4aea_a636_1d745ea8e3b6.slice/crio-d791cc7cb7410365cf96f81a673a95db803991cdeaac2bd791d4847a3cfbad1c WatchSource:0}: Error finding container d791cc7cb7410365cf96f81a673a95db803991cdeaac2bd791d4847a3cfbad1c: Status 404 returned error can't find the container with id d791cc7cb7410365cf96f81a673a95db803991cdeaac2bd791d4847a3cfbad1c Mar 20 13:44:59 crc kubenswrapper[4849]: E0320 13:44:59.365419 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.467646 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f759b9866-rl5dd"] Mar 20 13:44:59 crc kubenswrapper[4849]: W0320 13:44:59.478125 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf30102de_0f18_4a4a_80e6_58d2de7c230d.slice/crio-809331efff018b9f5ea363470ce58749cb18b72c55f748422b3c3d8ff9082fa4 WatchSource:0}: Error finding container 809331efff018b9f5ea363470ce58749cb18b72c55f748422b3c3d8ff9082fa4: Status 404 returned error can't find the container with id 809331efff018b9f5ea363470ce58749cb18b72c55f748422b3c3d8ff9082fa4 Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.578268 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68775b9c9d-w9j9w" 
event={"ID":"cbe1119f-65b7-4aea-a636-1d745ea8e3b6","Type":"ContainerStarted","Data":"88e2afcaf5ab460c656c6b5e551da66c2e9fa9aed47628d73269ed7e8e36372d"} Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.578306 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68775b9c9d-w9j9w" event={"ID":"cbe1119f-65b7-4aea-a636-1d745ea8e3b6","Type":"ContainerStarted","Data":"d791cc7cb7410365cf96f81a673a95db803991cdeaac2bd791d4847a3cfbad1c"} Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.580435 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" event={"ID":"f30102de-0f18-4a4a-80e6-58d2de7c230d","Type":"ContainerStarted","Data":"809331efff018b9f5ea363470ce58749cb18b72c55f748422b3c3d8ff9082fa4"} Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.584087 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e321362-ff76-4c26-bbde-9a97617ca460","Type":"ContainerStarted","Data":"fa7eb3584de8c092dc7d1bd204c1f96492cbf879e2b8350709197495e74534e6"} Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.584216 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="ceilometer-notification-agent" containerID="cri-o://1544e463846ecc6c58b261cb8a33d96be4cdf71b446787b0a0bf6a66673884ca" gracePeriod=30 Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.584294 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.584573 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="proxy-httpd" containerID="cri-o://fa7eb3584de8c092dc7d1bd204c1f96492cbf879e2b8350709197495e74534e6" gracePeriod=30 Mar 20 13:44:59 crc 
kubenswrapper[4849]: I0320 13:44:59.584615 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="sg-core" containerID="cri-o://40c515705c039c38d9fd6e728051db6c39e1c019d97d1246f992369906bd4ac0" gracePeriod=30 Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.723800 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-jtrvf"] Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.733936 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:44:59 crc kubenswrapper[4849]: W0320 13:44:59.736987 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f41b707_ab00_42a2_9472_cd761733addc.slice/crio-067a2dae943ba4635a5ebeeae05b8aa787064218aa8b02bcf08912a45a08151b WatchSource:0}: Error finding container 067a2dae943ba4635a5ebeeae05b8aa787064218aa8b02bcf08912a45a08151b: Status 404 returned error can't find the container with id 067a2dae943ba4635a5ebeeae05b8aa787064218aa8b02bcf08912a45a08151b Mar 20 13:44:59 crc kubenswrapper[4849]: W0320 13:44:59.741437 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf29aa501_5db8_44ee_b155_a2ffe7b521bc.slice/crio-4960b3144a5db5998da7fc7cdebea4ded196256ee1257624f3107daac7a09049 WatchSource:0}: Error finding container 4960b3144a5db5998da7fc7cdebea4ded196256ee1257624f3107daac7a09049: Status 404 returned error can't find the container with id 4960b3144a5db5998da7fc7cdebea4ded196256ee1257624f3107daac7a09049 Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.744907 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-845957dc9-clhc5"] Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.754882 4849 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-l8pb4"] Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.977651 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57974449fd-mzjh9"] Mar 20 13:44:59 crc kubenswrapper[4849]: I0320 13:44:59.985894 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:45:00 crc kubenswrapper[4849]: W0320 13:45:00.101037 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ca49bbd_dd90_4eee_bd44_59934f7d757e.slice/crio-88a398b40587a0158e1a29b3015c80622a46c13555524c7d59d45b27c499cbb0 WatchSource:0}: Error finding container 88a398b40587a0158e1a29b3015c80622a46c13555524c7d59d45b27c499cbb0: Status 404 returned error can't find the container with id 88a398b40587a0158e1a29b3015c80622a46c13555524c7d59d45b27c499cbb0 Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.150658 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4"] Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.167288 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.172463 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.172512 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.225979 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4"] Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.248868 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.313256 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5983a674-e322-403f-9578-39bed02774b4-secret-volume\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.314700 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6vn\" (UniqueName: \"kubernetes.io/projected/5983a674-e322-403f-9578-39bed02774b4-kube-api-access-kx6vn\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.314772 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5983a674-e322-403f-9578-39bed02774b4-config-volume\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.416671 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5983a674-e322-403f-9578-39bed02774b4-secret-volume\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.416721 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6vn\" (UniqueName: \"kubernetes.io/projected/5983a674-e322-403f-9578-39bed02774b4-kube-api-access-kx6vn\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.416749 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5983a674-e322-403f-9578-39bed02774b4-config-volume\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.417699 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5983a674-e322-403f-9578-39bed02774b4-config-volume\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.426939 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5983a674-e322-403f-9578-39bed02774b4-secret-volume\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.439484 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6vn\" (UniqueName: \"kubernetes.io/projected/5983a674-e322-403f-9578-39bed02774b4-kube-api-access-kx6vn\") pod \"collect-profiles-29566905-4fqs4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.490158 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.619342 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-845957dc9-clhc5" event={"ID":"f29aa501-5db8-44ee-b155-a2ffe7b521bc","Type":"ContainerStarted","Data":"4960b3144a5db5998da7fc7cdebea4ded196256ee1257624f3107daac7a09049"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.650657 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b51aadb-067d-4151-a0ba-fbbc5eb0625c","Type":"ContainerStarted","Data":"83569adb620484b327a2926a1fb3a83b19ffdcff5e9250b2883c32327d18857b"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.656694 4849 generic.go:334] "Generic (PLEG): container finished" podID="8793c8a1-a4d0-4d56-a889-0ae37233bb1f" containerID="ab254e6ddd8a0ec0318f11117219560e9622da5e78a99de6eb56e8e50ec641ad" exitCode=0 Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.656783 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" event={"ID":"8793c8a1-a4d0-4d56-a889-0ae37233bb1f","Type":"ContainerDied","Data":"ab254e6ddd8a0ec0318f11117219560e9622da5e78a99de6eb56e8e50ec641ad"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.656862 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" event={"ID":"8793c8a1-a4d0-4d56-a889-0ae37233bb1f","Type":"ContainerStarted","Data":"9066b5b0336d45a1b15bad709732e9159f261c91318822ba4ca2dd68592e701a"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.668402 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f41b707-ab00-42a2-9472-cd761733addc","Type":"ContainerStarted","Data":"067a2dae943ba4635a5ebeeae05b8aa787064218aa8b02bcf08912a45a08151b"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.681138 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57974449fd-mzjh9" event={"ID":"5ca49bbd-dd90-4eee-bd44-59934f7d757e","Type":"ContainerStarted","Data":"3b4dde763ede66ee6751d9c3417eeafb2e11c589d3a8e4147ad97c4eefedba14"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.681193 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57974449fd-mzjh9" event={"ID":"5ca49bbd-dd90-4eee-bd44-59934f7d757e","Type":"ContainerStarted","Data":"88a398b40587a0158e1a29b3015c80622a46c13555524c7d59d45b27c499cbb0"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.696465 4849 generic.go:334] "Generic (PLEG): container finished" podID="e33ed079-a0fe-4167-98b1-25339aaf90d2" containerID="246ee0bf4da946bf523f8fcf85b311a41ea0836b58f6c64d7b1a24e7b689e3d1" exitCode=0 Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.696598 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" 
event={"ID":"e33ed079-a0fe-4167-98b1-25339aaf90d2","Type":"ContainerDied","Data":"246ee0bf4da946bf523f8fcf85b311a41ea0836b58f6c64d7b1a24e7b689e3d1"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.696629 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" event={"ID":"e33ed079-a0fe-4167-98b1-25339aaf90d2","Type":"ContainerStarted","Data":"5cebf946227dfa73efba59d090e2a97b33fca33984567fe85d71ef28f3e01b76"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.703341 4849 generic.go:334] "Generic (PLEG): container finished" podID="5e321362-ff76-4c26-bbde-9a97617ca460" containerID="fa7eb3584de8c092dc7d1bd204c1f96492cbf879e2b8350709197495e74534e6" exitCode=0 Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.703378 4849 generic.go:334] "Generic (PLEG): container finished" podID="5e321362-ff76-4c26-bbde-9a97617ca460" containerID="40c515705c039c38d9fd6e728051db6c39e1c019d97d1246f992369906bd4ac0" exitCode=2 Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.703387 4849 generic.go:334] "Generic (PLEG): container finished" podID="5e321362-ff76-4c26-bbde-9a97617ca460" containerID="1544e463846ecc6c58b261cb8a33d96be4cdf71b446787b0a0bf6a66673884ca" exitCode=0 Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.703434 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e321362-ff76-4c26-bbde-9a97617ca460","Type":"ContainerDied","Data":"fa7eb3584de8c092dc7d1bd204c1f96492cbf879e2b8350709197495e74534e6"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.703459 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e321362-ff76-4c26-bbde-9a97617ca460","Type":"ContainerDied","Data":"40c515705c039c38d9fd6e728051db6c39e1c019d97d1246f992369906bd4ac0"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.703472 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5e321362-ff76-4c26-bbde-9a97617ca460","Type":"ContainerDied","Data":"1544e463846ecc6c58b261cb8a33d96be4cdf71b446787b0a0bf6a66673884ca"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.707779 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68775b9c9d-w9j9w" event={"ID":"cbe1119f-65b7-4aea-a636-1d745ea8e3b6","Type":"ContainerStarted","Data":"ab8a46b715e52e3ac87fe3c623909c7c587b58ae125efb3e3139ea2a60f86f72"} Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.708104 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.708177 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.761037 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-68775b9c9d-w9j9w" podStartSLOduration=4.76101819 podStartE2EDuration="4.76101819s" podCreationTimestamp="2026-03-20 13:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:00.753148001 +0000 UTC m=+1250.430871416" watchObservedRunningTime="2026-03-20 13:45:00.76101819 +0000 UTC m=+1250.438741585" Mar 20 13:45:00 crc kubenswrapper[4849]: I0320 13:45:00.961853 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.039198 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-run-httpd\") pod \"5e321362-ff76-4c26-bbde-9a97617ca460\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.039275 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-log-httpd\") pod \"5e321362-ff76-4c26-bbde-9a97617ca460\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.039345 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-config-data\") pod \"5e321362-ff76-4c26-bbde-9a97617ca460\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.039376 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-sg-core-conf-yaml\") pod \"5e321362-ff76-4c26-bbde-9a97617ca460\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.039458 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-combined-ca-bundle\") pod \"5e321362-ff76-4c26-bbde-9a97617ca460\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.039499 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fthp2\" (UniqueName: 
\"kubernetes.io/projected/5e321362-ff76-4c26-bbde-9a97617ca460-kube-api-access-fthp2\") pod \"5e321362-ff76-4c26-bbde-9a97617ca460\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.039582 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-scripts\") pod \"5e321362-ff76-4c26-bbde-9a97617ca460\" (UID: \"5e321362-ff76-4c26-bbde-9a97617ca460\") " Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.045752 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5e321362-ff76-4c26-bbde-9a97617ca460" (UID: "5e321362-ff76-4c26-bbde-9a97617ca460"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.048499 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5e321362-ff76-4c26-bbde-9a97617ca460" (UID: "5e321362-ff76-4c26-bbde-9a97617ca460"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.052717 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-scripts" (OuterVolumeSpecName: "scripts") pod "5e321362-ff76-4c26-bbde-9a97617ca460" (UID: "5e321362-ff76-4c26-bbde-9a97617ca460"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.061102 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e321362-ff76-4c26-bbde-9a97617ca460-kube-api-access-fthp2" (OuterVolumeSpecName: "kube-api-access-fthp2") pod "5e321362-ff76-4c26-bbde-9a97617ca460" (UID: "5e321362-ff76-4c26-bbde-9a97617ca460"). InnerVolumeSpecName "kube-api-access-fthp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.148231 4849 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.148285 4849 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e321362-ff76-4c26-bbde-9a97617ca460-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.148299 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fthp2\" (UniqueName: \"kubernetes.io/projected/5e321362-ff76-4c26-bbde-9a97617ca460-kube-api-access-fthp2\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.148316 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.153876 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4"] Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.157857 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-sg-core-conf-yaml" (OuterVolumeSpecName: 
"sg-core-conf-yaml") pod "5e321362-ff76-4c26-bbde-9a97617ca460" (UID: "5e321362-ff76-4c26-bbde-9a97617ca460"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.197683 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e321362-ff76-4c26-bbde-9a97617ca460" (UID: "5e321362-ff76-4c26-bbde-9a97617ca460"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.229879 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-config-data" (OuterVolumeSpecName: "config-data") pod "5e321362-ff76-4c26-bbde-9a97617ca460" (UID: "5e321362-ff76-4c26-bbde-9a97617ca460"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.249604 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.249633 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.249642 4849 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e321362-ff76-4c26-bbde-9a97617ca460-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.719210 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" event={"ID":"e33ed079-a0fe-4167-98b1-25339aaf90d2","Type":"ContainerStarted","Data":"371b262defdab34bf7aa2f4f1abe9721f0b0f3fca163eace16b6672e2f381e1c"} Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.719580 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.721804 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e321362-ff76-4c26-bbde-9a97617ca460","Type":"ContainerDied","Data":"9654243ab408b918f14113295b644a6adcb4933523b20481b3368be5fd4aadea"} Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.721839 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.721868 4849 scope.go:117] "RemoveContainer" containerID="fa7eb3584de8c092dc7d1bd204c1f96492cbf879e2b8350709197495e74534e6" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.728400 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b51aadb-067d-4151-a0ba-fbbc5eb0625c","Type":"ContainerStarted","Data":"5a65a331c48ab1d8ab23e399382aadb7fd7682dc4af09626349dddeb49e2206b"} Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.742423 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57974449fd-mzjh9" event={"ID":"5ca49bbd-dd90-4eee-bd44-59934f7d757e","Type":"ContainerStarted","Data":"4ff8295efe4b28b1dea15510201a5538d89fa33809686080fcb5724567d00e06"} Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.742468 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.742502 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.758514 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" podStartSLOduration=4.758496276 podStartE2EDuration="4.758496276s" podCreationTimestamp="2026-03-20 13:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:01.740184741 +0000 UTC m=+1251.417908156" watchObservedRunningTime="2026-03-20 13:45:01.758496276 +0000 UTC m=+1251.436219671" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.794998 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.806110 4849 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.831848 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:01 crc kubenswrapper[4849]: E0320 13:45:01.832623 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="ceilometer-notification-agent" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.832648 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="ceilometer-notification-agent" Mar 20 13:45:01 crc kubenswrapper[4849]: E0320 13:45:01.832683 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="sg-core" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.832692 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="sg-core" Mar 20 13:45:01 crc kubenswrapper[4849]: E0320 13:45:01.833028 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="proxy-httpd" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.833050 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="proxy-httpd" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.833559 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="proxy-httpd" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.833621 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" containerName="sg-core" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.833641 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" 
containerName="ceilometer-notification-agent" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.837759 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.841505 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.841704 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.845972 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57974449fd-mzjh9" podStartSLOduration=8.845948212 podStartE2EDuration="8.845948212s" podCreationTimestamp="2026-03-20 13:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:01.791303325 +0000 UTC m=+1251.469026730" watchObservedRunningTime="2026-03-20 13:45:01.845948212 +0000 UTC m=+1251.523671627" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.874409 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.913970 4849 scope.go:117] "RemoveContainer" containerID="40c515705c039c38d9fd6e728051db6c39e1c019d97d1246f992369906bd4ac0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.963904 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-scripts\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.963951 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-run-httpd\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.964034 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-config-data\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.964189 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.964445 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-log-httpd\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.964685 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfnnh\" (UniqueName: \"kubernetes.io/projected/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-kube-api-access-bfnnh\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:01 crc kubenswrapper[4849]: I0320 13:45:01.964859 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.006487 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.066289 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-config\") pod \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.066385 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dplm5\" (UniqueName: \"kubernetes.io/projected/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-kube-api-access-dplm5\") pod \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.066425 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-swift-storage-0\") pod \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.066478 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-nb\") pod \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.066516 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-sb\") pod \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\" (UID: 
\"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.066625 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-svc\") pod \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\" (UID: \"8793c8a1-a4d0-4d56-a889-0ae37233bb1f\") " Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.066901 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfnnh\" (UniqueName: \"kubernetes.io/projected/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-kube-api-access-bfnnh\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.066937 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.067012 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-scripts\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.067037 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-run-httpd\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.067119 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-config-data\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.067150 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.067229 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-log-httpd\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.069282 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-log-httpd\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.070248 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-kube-api-access-dplm5" (OuterVolumeSpecName: "kube-api-access-dplm5") pod "8793c8a1-a4d0-4d56-a889-0ae37233bb1f" (UID: "8793c8a1-a4d0-4d56-a889-0ae37233bb1f"). InnerVolumeSpecName "kube-api-access-dplm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.070555 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-run-httpd\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.074129 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-scripts\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.075391 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.083550 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-config-data\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.084050 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.085709 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfnnh\" (UniqueName: 
\"kubernetes.io/projected/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-kube-api-access-bfnnh\") pod \"ceilometer-0\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.098139 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8793c8a1-a4d0-4d56-a889-0ae37233bb1f" (UID: "8793c8a1-a4d0-4d56-a889-0ae37233bb1f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.099208 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-config" (OuterVolumeSpecName: "config") pod "8793c8a1-a4d0-4d56-a889-0ae37233bb1f" (UID: "8793c8a1-a4d0-4d56-a889-0ae37233bb1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.104219 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8793c8a1-a4d0-4d56-a889-0ae37233bb1f" (UID: "8793c8a1-a4d0-4d56-a889-0ae37233bb1f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.107862 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8793c8a1-a4d0-4d56-a889-0ae37233bb1f" (UID: "8793c8a1-a4d0-4d56-a889-0ae37233bb1f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.109948 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8793c8a1-a4d0-4d56-a889-0ae37233bb1f" (UID: "8793c8a1-a4d0-4d56-a889-0ae37233bb1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.167609 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.168761 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.168795 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dplm5\" (UniqueName: \"kubernetes.io/projected/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-kube-api-access-dplm5\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.168807 4849 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.168835 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.168847 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 
13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.168858 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8793c8a1-a4d0-4d56-a889-0ae37233bb1f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.472766 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b4f644b5b-zlgdg" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.705207 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79ff4b8df9-qp9mn"] Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.706030 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79ff4b8df9-qp9mn" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-api" containerID="cri-o://1b39e59dfea417f45beb77beb7be33544ccf333b38fbba38f6ed15ea2d745a77" gracePeriod=30 Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.706175 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79ff4b8df9-qp9mn" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-httpd" containerID="cri-o://941061793fb6895d944055b6d1b460415519760a1c867b677555e545de84da5d" gracePeriod=30 Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.724261 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-749b7fc4bf-nwzdf"] Mar 20 13:45:02 crc kubenswrapper[4849]: E0320 13:45:02.724638 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8793c8a1-a4d0-4d56-a889-0ae37233bb1f" containerName="init" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.724651 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="8793c8a1-a4d0-4d56-a889-0ae37233bb1f" containerName="init" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.725964 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="8793c8a1-a4d0-4d56-a889-0ae37233bb1f" 
containerName="init" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.727514 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.731609 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.753161 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.753158 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-jtrvf" event={"ID":"8793c8a1-a4d0-4d56-a889-0ae37233bb1f","Type":"ContainerDied","Data":"9066b5b0336d45a1b15bad709732e9159f261c91318822ba4ca2dd68592e701a"} Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.757741 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" event={"ID":"5983a674-e322-403f-9578-39bed02774b4","Type":"ContainerStarted","Data":"1ead021422e0fd8ff8aecf04fe4a1cfece3284e45685efee6372958cc468662a"} Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.763279 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f41b707-ab00-42a2-9472-cd761733addc","Type":"ContainerStarted","Data":"76caa296c758a2d6f13892e5c663cd03a8abba4518249fa55ad4c3ca8bf083a5"} Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.764976 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-749b7fc4bf-nwzdf"] Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.781604 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz927\" (UniqueName: \"kubernetes.io/projected/8d4c5d75-d3c3-4035-852d-026e9183444c-kube-api-access-tz927\") pod 
\"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.781700 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-httpd-config\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.781751 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-ovndb-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.781770 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-public-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.781812 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-combined-ca-bundle\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.781884 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-config\") pod 
\"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.781978 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-internal-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.796201 4849 scope.go:117] "RemoveContainer" containerID="1544e463846ecc6c58b261cb8a33d96be4cdf71b446787b0a0bf6a66673884ca" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.885342 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-internal-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.885917 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz927\" (UniqueName: \"kubernetes.io/projected/8d4c5d75-d3c3-4035-852d-026e9183444c-kube-api-access-tz927\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.885987 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-httpd-config\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.886053 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-public-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.886074 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-ovndb-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.886121 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-combined-ca-bundle\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.886305 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-config\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.892755 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-internal-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.899753 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-httpd-config\") pod 
\"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.900612 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-config\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.901006 4849 scope.go:117] "RemoveContainer" containerID="ab254e6ddd8a0ec0318f11117219560e9622da5e78a99de6eb56e8e50ec641ad" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.901838 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-ovndb-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.902873 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-jtrvf"] Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.905780 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-public-tls-certs\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.910800 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz927\" (UniqueName: \"kubernetes.io/projected/8d4c5d75-d3c3-4035-852d-026e9183444c-kube-api-access-tz927\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 
13:45:02.911815 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4c5d75-d3c3-4035-852d-026e9183444c-combined-ca-bundle\") pod \"neutron-749b7fc4bf-nwzdf\" (UID: \"8d4c5d75-d3c3-4035-852d-026e9183444c\") " pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:02 crc kubenswrapper[4849]: I0320 13:45:02.935091 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-jtrvf"] Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.052750 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e321362-ff76-4c26-bbde-9a97617ca460" path="/var/lib/kubelet/pods/5e321362-ff76-4c26-bbde-9a97617ca460/volumes" Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.058538 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8793c8a1-a4d0-4d56-a889-0ae37233bb1f" path="/var/lib/kubelet/pods/8793c8a1-a4d0-4d56-a889-0ae37233bb1f/volumes" Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.058696 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.446596 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.816248 4849 generic.go:334] "Generic (PLEG): container finished" podID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerID="941061793fb6895d944055b6d1b460415519760a1c867b677555e545de84da5d" exitCode=0 Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.816304 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79ff4b8df9-qp9mn" event={"ID":"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d","Type":"ContainerDied","Data":"941061793fb6895d944055b6d1b460415519760a1c867b677555e545de84da5d"} Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.824118 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" event={"ID":"f30102de-0f18-4a4a-80e6-58d2de7c230d","Type":"ContainerStarted","Data":"fd38a465372b62399eebf7556e9e221d5954a3ca1ec21c934874a0ab8e2ff414"} Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.824400 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" event={"ID":"f30102de-0f18-4a4a-80e6-58d2de7c230d","Type":"ContainerStarted","Data":"bbcea13fb6832b8af89dba01e11a85f52081142cc714838eb724c2ba1962df2f"} Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.827451 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-845957dc9-clhc5" event={"ID":"f29aa501-5db8-44ee-b155-a2ffe7b521bc","Type":"ContainerStarted","Data":"fb6f9c4cfc6219d7c9c13fca253706723e992b4745185b81564b5c301b135849"} Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.827478 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-845957dc9-clhc5" 
event={"ID":"f29aa501-5db8-44ee-b155-a2ffe7b521bc","Type":"ContainerStarted","Data":"0cf1f6a760db58e3ba57b5e23b363da144a554a60763c8c91b8822cb55e8db2c"} Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.843894 4849 generic.go:334] "Generic (PLEG): container finished" podID="5983a674-e322-403f-9578-39bed02774b4" containerID="5e3e655074b74828cb5977b6b4f5a29f47c84d7b37304febbb906b0ec593ffd2" exitCode=0 Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.843960 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" event={"ID":"5983a674-e322-403f-9578-39bed02774b4","Type":"ContainerDied","Data":"5e3e655074b74828cb5977b6b4f5a29f47c84d7b37304febbb906b0ec593ffd2"} Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.848773 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerStarted","Data":"0b2df2e637d33a9ac3775e67fe1cb3353181c19587c1d52030a7f2072d3be7a6"} Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.851210 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6f759b9866-rl5dd" podStartSLOduration=7.50466425 podStartE2EDuration="10.85118801s" podCreationTimestamp="2026-03-20 13:44:53 +0000 UTC" firstStartedPulling="2026-03-20 13:44:59.4810286 +0000 UTC m=+1249.158751995" lastFinishedPulling="2026-03-20 13:45:02.82755236 +0000 UTC m=+1252.505275755" observedRunningTime="2026-03-20 13:45:03.84477727 +0000 UTC m=+1253.522500685" watchObservedRunningTime="2026-03-20 13:45:03.85118801 +0000 UTC m=+1253.528911415" Mar 20 13:45:03 crc kubenswrapper[4849]: I0320 13:45:03.974427 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-845957dc9-clhc5" podStartSLOduration=7.831743493 podStartE2EDuration="10.974402263s" podCreationTimestamp="2026-03-20 13:44:53 +0000 UTC" 
firstStartedPulling="2026-03-20 13:44:59.743137132 +0000 UTC m=+1249.420860527" lastFinishedPulling="2026-03-20 13:45:02.885795902 +0000 UTC m=+1252.563519297" observedRunningTime="2026-03-20 13:45:03.887275425 +0000 UTC m=+1253.564998820" watchObservedRunningTime="2026-03-20 13:45:03.974402263 +0000 UTC m=+1253.652125688" Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.064991 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-749b7fc4bf-nwzdf"] Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.151405 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-68899bcb64-snjqk" Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.560375 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7f784755c6-j267c" Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.586758 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-79ff4b8df9-qp9mn" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.892020 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b7fc4bf-nwzdf" event={"ID":"8d4c5d75-d3c3-4035-852d-026e9183444c","Type":"ContainerStarted","Data":"58a8c9938bbaa24923a7e53efb19616168776212d75307d48303cd350ded54a7"} Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.892061 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b7fc4bf-nwzdf" event={"ID":"8d4c5d75-d3c3-4035-852d-026e9183444c","Type":"ContainerStarted","Data":"69078b833e2fcd0cafc6aa273b6d70efbb97f6d5e6a666f6e6034ac41f05d37c"} Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.892071 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-749b7fc4bf-nwzdf" 
event={"ID":"8d4c5d75-d3c3-4035-852d-026e9183444c","Type":"ContainerStarted","Data":"d46d876ca77d795e990f3b19af1fc43bf69f4b21402bf86a6f3bdd424fe202fd"} Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.893123 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-749b7fc4bf-nwzdf" Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.911770 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b51aadb-067d-4151-a0ba-fbbc5eb0625c","Type":"ContainerStarted","Data":"ff6186583e332069f84c4f148003b0339cd1ce03a24bff565e7dfb5bfbac3f2e"} Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.911951 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerName="cinder-api-log" containerID="cri-o://5a65a331c48ab1d8ab23e399382aadb7fd7682dc4af09626349dddeb49e2206b" gracePeriod=30 Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.912176 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.912211 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerName="cinder-api" containerID="cri-o://ff6186583e332069f84c4f148003b0339cd1ce03a24bff565e7dfb5bfbac3f2e" gracePeriod=30 Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.915587 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerStarted","Data":"5c8ee983feef9979881c73e49772707c20c468ba347e3a33f0805a3e23087e6f"} Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.923129 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-749b7fc4bf-nwzdf" podStartSLOduration=2.923107998 
podStartE2EDuration="2.923107998s" podCreationTimestamp="2026-03-20 13:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:04.913142074 +0000 UTC m=+1254.590865479" watchObservedRunningTime="2026-03-20 13:45:04.923107998 +0000 UTC m=+1254.600831393" Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.928623 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f41b707-ab00-42a2-9472-cd761733addc","Type":"ContainerStarted","Data":"47ad10e192eab044a506b16f61b3d058fe52c5d53ead1431411be3904fe2b71f"} Mar 20 13:45:04 crc kubenswrapper[4849]: I0320 13:45:04.999152 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.9991306909999995 podStartE2EDuration="7.999130691s" podCreationTimestamp="2026-03-20 13:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:04.941771952 +0000 UTC m=+1254.619495347" watchObservedRunningTime="2026-03-20 13:45:04.999130691 +0000 UTC m=+1254.676854086" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:04.999781 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.080740918 podStartE2EDuration="7.999776038s" podCreationTimestamp="2026-03-20 13:44:57 +0000 UTC" firstStartedPulling="2026-03-20 13:44:59.741813847 +0000 UTC m=+1249.419537242" lastFinishedPulling="2026-03-20 13:45:00.660848967 +0000 UTC m=+1250.338572362" observedRunningTime="2026-03-20 13:45:04.980349524 +0000 UTC m=+1254.658072919" watchObservedRunningTime="2026-03-20 13:45:04.999776038 +0000 UTC m=+1254.677499433" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.414568 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.486066 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx6vn\" (UniqueName: \"kubernetes.io/projected/5983a674-e322-403f-9578-39bed02774b4-kube-api-access-kx6vn\") pod \"5983a674-e322-403f-9578-39bed02774b4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.486286 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5983a674-e322-403f-9578-39bed02774b4-config-volume\") pod \"5983a674-e322-403f-9578-39bed02774b4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.486395 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5983a674-e322-403f-9578-39bed02774b4-secret-volume\") pod \"5983a674-e322-403f-9578-39bed02774b4\" (UID: \"5983a674-e322-403f-9578-39bed02774b4\") " Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.487836 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5983a674-e322-403f-9578-39bed02774b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "5983a674-e322-403f-9578-39bed02774b4" (UID: "5983a674-e322-403f-9578-39bed02774b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.499032 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5983a674-e322-403f-9578-39bed02774b4-kube-api-access-kx6vn" (OuterVolumeSpecName: "kube-api-access-kx6vn") pod "5983a674-e322-403f-9578-39bed02774b4" (UID: "5983a674-e322-403f-9578-39bed02774b4"). 
InnerVolumeSpecName "kube-api-access-kx6vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.502147 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5983a674-e322-403f-9578-39bed02774b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5983a674-e322-403f-9578-39bed02774b4" (UID: "5983a674-e322-403f-9578-39bed02774b4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.590111 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx6vn\" (UniqueName: \"kubernetes.io/projected/5983a674-e322-403f-9578-39bed02774b4-kube-api-access-kx6vn\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.590158 4849 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5983a674-e322-403f-9578-39bed02774b4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.590174 4849 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5983a674-e322-403f-9578-39bed02774b4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.971881 4849 generic.go:334] "Generic (PLEG): container finished" podID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerID="ff6186583e332069f84c4f148003b0339cd1ce03a24bff565e7dfb5bfbac3f2e" exitCode=0 Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.972153 4849 generic.go:334] "Generic (PLEG): container finished" podID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerID="5a65a331c48ab1d8ab23e399382aadb7fd7682dc4af09626349dddeb49e2206b" exitCode=143 Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.972203 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"6b51aadb-067d-4151-a0ba-fbbc5eb0625c","Type":"ContainerDied","Data":"ff6186583e332069f84c4f148003b0339cd1ce03a24bff565e7dfb5bfbac3f2e"} Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.972231 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b51aadb-067d-4151-a0ba-fbbc5eb0625c","Type":"ContainerDied","Data":"5a65a331c48ab1d8ab23e399382aadb7fd7682dc4af09626349dddeb49e2206b"} Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.974234 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.975635 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-4fqs4" event={"ID":"5983a674-e322-403f-9578-39bed02774b4","Type":"ContainerDied","Data":"1ead021422e0fd8ff8aecf04fe4a1cfece3284e45685efee6372958cc468662a"} Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.975664 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ead021422e0fd8ff8aecf04fe4a1cfece3284e45685efee6372958cc468662a" Mar 20 13:45:05 crc kubenswrapper[4849]: I0320 13:45:05.991952 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerStarted","Data":"5edbe1ef1aa6533e60349eea9ee2f7764f5cd71015c2a31a495f6aedb801a7f3"} Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.007046 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.015064 4849 generic.go:334] "Generic (PLEG): container finished" podID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerID="1b39e59dfea417f45beb77beb7be33544ccf333b38fbba38f6ed15ea2d745a77" exitCode=0 Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.015143 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79ff4b8df9-qp9mn" event={"ID":"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d","Type":"ContainerDied","Data":"1b39e59dfea417f45beb77beb7be33544ccf333b38fbba38f6ed15ea2d745a77"} Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.064687 4849 generic.go:334] "Generic (PLEG): container finished" podID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerID="edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88" exitCode=137 Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.064717 4849 generic.go:334] "Generic (PLEG): container finished" podID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerID="60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7" exitCode=137 Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.065240 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-79d4788db5-tz9b5" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.065600 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d4788db5-tz9b5" event={"ID":"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7","Type":"ContainerDied","Data":"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88"} Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.065626 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d4788db5-tz9b5" event={"ID":"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7","Type":"ContainerDied","Data":"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7"} Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.065636 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-79d4788db5-tz9b5" event={"ID":"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7","Type":"ContainerDied","Data":"99786e49ab601b6814a3f65c923db713f4f2d094ca56a6e90168316dd17d6bcd"} Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.065655 4849 scope.go:117] "RemoveContainer" containerID="edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.104654 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-config-data\") pod \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.104849 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v2hj\" (UniqueName: \"kubernetes.io/projected/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-kube-api-access-9v2hj\") pod \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.104892 4849 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-horizon-secret-key\") pod \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.104980 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-scripts\") pod \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.105052 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-logs\") pod \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\" (UID: \"ef4d07e7-fc99-4d1c-b424-01dd7a58edd7\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.118412 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-logs" (OuterVolumeSpecName: "logs") pod "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" (UID: "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.129930 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.138062 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" (UID: "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.164081 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-kube-api-access-9v2hj" (OuterVolumeSpecName: "kube-api-access-9v2hj") pod "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" (UID: "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7"). InnerVolumeSpecName "kube-api-access-9v2hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.167275 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.170178 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-scripts" (OuterVolumeSpecName: "scripts") pod "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" (UID: "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.190755 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-config-data" (OuterVolumeSpecName: "config-data") pod "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" (UID: "ef4d07e7-fc99-4d1c-b424-01dd7a58edd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241023 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-config\") pod \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241136 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-combined-ca-bundle\") pod \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241187 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-combined-ca-bundle\") pod \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241211 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-etc-machine-id\") pod \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241240 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl2tq\" (UniqueName: \"kubernetes.io/projected/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-kube-api-access-kl2tq\") pod \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241376 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data\") pod \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241394 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data-custom\") pod \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241437 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-httpd-config\") pod \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241535 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-logs\") pod \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241608 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-public-tls-certs\") pod \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241626 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-scripts\") pod \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241644 4849 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-internal-tls-certs\") pod \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241701 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc9pl\" (UniqueName: \"kubernetes.io/projected/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-kube-api-access-qc9pl\") pod \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\" (UID: \"6b51aadb-067d-4151-a0ba-fbbc5eb0625c\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.241717 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-ovndb-tls-certs\") pod \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\" (UID: \"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d\") " Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.251436 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" (UID: "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.252942 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6b51aadb-067d-4151-a0ba-fbbc5eb0625c" (UID: "6b51aadb-067d-4151-a0ba-fbbc5eb0625c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.259597 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v2hj\" (UniqueName: \"kubernetes.io/projected/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-kube-api-access-9v2hj\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.259642 4849 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.259652 4849 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.259663 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.259674 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.259682 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.259692 4849 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.262266 4849 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-logs" (OuterVolumeSpecName: "logs") pod "6b51aadb-067d-4151-a0ba-fbbc5eb0625c" (UID: "6b51aadb-067d-4151-a0ba-fbbc5eb0625c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.264222 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-scripts" (OuterVolumeSpecName: "scripts") pod "6b51aadb-067d-4151-a0ba-fbbc5eb0625c" (UID: "6b51aadb-067d-4151-a0ba-fbbc5eb0625c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.268156 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-kube-api-access-qc9pl" (OuterVolumeSpecName: "kube-api-access-qc9pl") pod "6b51aadb-067d-4151-a0ba-fbbc5eb0625c" (UID: "6b51aadb-067d-4151-a0ba-fbbc5eb0625c"). InnerVolumeSpecName "kube-api-access-qc9pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.268235 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-kube-api-access-kl2tq" (OuterVolumeSpecName: "kube-api-access-kl2tq") pod "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" (UID: "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d"). InnerVolumeSpecName "kube-api-access-kl2tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.287709 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b51aadb-067d-4151-a0ba-fbbc5eb0625c" (UID: "6b51aadb-067d-4151-a0ba-fbbc5eb0625c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.345435 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" (UID: "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.361806 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl2tq\" (UniqueName: \"kubernetes.io/projected/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-kube-api-access-kl2tq\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.361879 4849 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.361889 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.361900 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.361908 4849 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.361917 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc9pl\" (UniqueName: \"kubernetes.io/projected/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-kube-api-access-qc9pl\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.375953 4849 scope.go:117] "RemoveContainer" containerID="60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.381039 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b51aadb-067d-4151-a0ba-fbbc5eb0625c" (UID: "6b51aadb-067d-4151-a0ba-fbbc5eb0625c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.381934 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" (UID: "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.425787 4849 scope.go:117] "RemoveContainer" containerID="edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88" Mar 20 13:45:06 crc kubenswrapper[4849]: E0320 13:45:06.431250 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88\": container with ID starting with edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88 not found: ID does not exist" containerID="edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.431289 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88"} err="failed to get container status \"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88\": rpc error: code = NotFound desc = could not find container \"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88\": container with ID starting with edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88 not found: ID does not exist" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.431312 4849 scope.go:117] "RemoveContainer" containerID="60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.434845 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" (UID: "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: E0320 13:45:06.435203 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7\": container with ID starting with 60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7 not found: ID does not exist" containerID="60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.435315 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7"} err="failed to get container status \"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7\": rpc error: code = NotFound desc = could not find container \"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7\": container with ID starting with 60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7 not found: ID does not exist" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.435348 4849 scope.go:117] "RemoveContainer" containerID="edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.435802 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88"} err="failed to get container status \"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88\": rpc error: code = NotFound desc = could not find container \"edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88\": container with ID starting with edb92aca0d2606b066379bad93d4a88b4621049eb40b2fed21b751f32a125c88 not found: ID does not exist" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.435912 4849 scope.go:117] "RemoveContainer" 
containerID="60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.436191 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7"} err="failed to get container status \"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7\": rpc error: code = NotFound desc = could not find container \"60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7\": container with ID starting with 60790b8d502b69659d73eb12dfdca84ae033de21dfb3411c91cd3f14a5308ae7 not found: ID does not exist" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.441862 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data" (OuterVolumeSpecName: "config-data") pod "6b51aadb-067d-4151-a0ba-fbbc5eb0625c" (UID: "6b51aadb-067d-4151-a0ba-fbbc5eb0625c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.453208 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-config" (OuterVolumeSpecName: "config") pod "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" (UID: "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.463626 4849 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.463969 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.464017 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.464030 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.464042 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b51aadb-067d-4151-a0ba-fbbc5eb0625c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.466082 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-79d4788db5-tz9b5"] Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.501028 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" (UID: "a5555c6f-b480-42a0-a2cd-e3ad41c74a2d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.510970 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-79d4788db5-tz9b5"] Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.567572 4849 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:06 crc kubenswrapper[4849]: I0320 13:45:06.857448 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-68899bcb64-snjqk" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.052258 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" path="/var/lib/kubelet/pods/ef4d07e7-fc99-4d1c-b424-01dd7a58edd7/volumes" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.086851 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79ff4b8df9-qp9mn" event={"ID":"a5555c6f-b480-42a0-a2cd-e3ad41c74a2d","Type":"ContainerDied","Data":"8fa5940f1022dd7512b6037c04cf7f862d14246b995b2fb4036b53c38847bbdd"} Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.087103 4849 scope.go:117] "RemoveContainer" containerID="941061793fb6895d944055b6d1b460415519760a1c867b677555e545de84da5d" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.087199 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79ff4b8df9-qp9mn" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.109481 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.109698 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6b51aadb-067d-4151-a0ba-fbbc5eb0625c","Type":"ContainerDied","Data":"83569adb620484b327a2926a1fb3a83b19ffdcff5e9250b2883c32327d18857b"} Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.146007 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79ff4b8df9-qp9mn"] Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.160761 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79ff4b8df9-qp9mn"] Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.167467 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.191796 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.205879 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:45:07 crc kubenswrapper[4849]: E0320 13:45:07.206271 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5983a674-e322-403f-9578-39bed02774b4" containerName="collect-profiles" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.206295 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5983a674-e322-403f-9578-39bed02774b4" containerName="collect-profiles" Mar 20 13:45:07 crc kubenswrapper[4849]: E0320 13:45:07.206313 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerName="horizon" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.206323 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerName="horizon" Mar 20 13:45:07 crc kubenswrapper[4849]: E0320 13:45:07.206340 4849 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerName="cinder-api-log" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.206347 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerName="cinder-api-log" Mar 20 13:45:07 crc kubenswrapper[4849]: E0320 13:45:07.206356 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-api" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.206364 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-api" Mar 20 13:45:07 crc kubenswrapper[4849]: E0320 13:45:07.206373 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerName="cinder-api" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.206381 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerName="cinder-api" Mar 20 13:45:07 crc kubenswrapper[4849]: E0320 13:45:07.206407 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-httpd" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.206415 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-httpd" Mar 20 13:45:07 crc kubenswrapper[4849]: E0320 13:45:07.206438 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerName="horizon-log" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.206446 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerName="horizon-log" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.207741 4849 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerName="horizon" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.207772 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-api" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.207795 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" containerName="neutron-httpd" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.207810 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerName="cinder-api-log" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.207921 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" containerName="cinder-api" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.207937 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5983a674-e322-403f-9578-39bed02774b4" containerName="collect-profiles" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.207951 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef4d07e7-fc99-4d1c-b424-01dd7a58edd7" containerName="horizon-log" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.209363 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.212574 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.212800 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.213014 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.216336 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.241145 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7f784755c6-j267c" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.305373 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68899bcb64-snjqk"] Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.305577 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68899bcb64-snjqk" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon-log" containerID="cri-o://0a9174601ae80fb33d84c0233114f71b6f537dcd6ab8a11c4d2eb6c6ce77db76" gracePeriod=30 Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.305997 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-68899bcb64-snjqk" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon" containerID="cri-o://717f03af8e71b4627fc46deedfa0ff5f51c7251a3525a9f6fbfab7cf39df9d1d" gracePeriod=30 Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.342478 4849 scope.go:117] "RemoveContainer" containerID="1b39e59dfea417f45beb77beb7be33544ccf333b38fbba38f6ed15ea2d745a77" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.375736 4849 
scope.go:117] "RemoveContainer" containerID="ff6186583e332069f84c4f148003b0339cd1ce03a24bff565e7dfb5bfbac3f2e" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384369 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384465 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-config-data-custom\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384497 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384516 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-config-data\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384566 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 
13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384592 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-logs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384614 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb4nf\" (UniqueName: \"kubernetes.io/projected/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-kube-api-access-lb4nf\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384643 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.384658 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-scripts\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.485813 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.486126 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-scripts\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.486284 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.486441 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-config-data-custom\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.486556 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.486651 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-config-data\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.486788 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc 
kubenswrapper[4849]: I0320 13:45:07.486933 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-logs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.487049 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb4nf\" (UniqueName: \"kubernetes.io/projected/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-kube-api-access-lb4nf\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.488116 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.488436 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-logs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.494810 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.499758 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-config-data\") pod \"cinder-api-0\" (UID: 
\"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.501384 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.502155 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-scripts\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.504985 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-config-data-custom\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.512542 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb4nf\" (UniqueName: \"kubernetes.io/projected/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-kube-api-access-lb4nf\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.512580 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45c13e7-1cf2-4e2a-993b-83f1aa9428cb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb\") " pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.531243 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:45:07 crc kubenswrapper[4849]: I0320 13:45:07.567713 4849 scope.go:117] "RemoveContainer" containerID="5a65a331c48ab1d8ab23e399382aadb7fd7682dc4af09626349dddeb49e2206b" Mar 20 13:45:08 crc kubenswrapper[4849]: I0320 13:45:08.105849 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:45:08 crc kubenswrapper[4849]: I0320 13:45:08.120074 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerStarted","Data":"93fa7757d63ec3636c95a1fa1e1a6545841f3e38217c7845960e3239eeaccbb1"} Mar 20 13:45:08 crc kubenswrapper[4849]: I0320 13:45:08.148498 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:45:08 crc kubenswrapper[4849]: I0320 13:45:08.332023 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:45:08 crc kubenswrapper[4849]: I0320 13:45:08.387781 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h6lmv"] Mar 20 13:45:08 crc kubenswrapper[4849]: I0320 13:45:08.394950 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" podUID="effa400c-2afb-4abd-a644-6c15cdad3ee1" containerName="dnsmasq-dns" containerID="cri-o://0d89e398b4272f12dbec65de3b811005b6c969c4cdcee69ddc8c1ee9f61cccd5" gracePeriod=10 Mar 20 13:45:08 crc kubenswrapper[4849]: I0320 13:45:08.410264 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 13:45:08 crc kubenswrapper[4849]: I0320 13:45:08.488050 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.047692 4849 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="6b51aadb-067d-4151-a0ba-fbbc5eb0625c" path="/var/lib/kubelet/pods/6b51aadb-067d-4151-a0ba-fbbc5eb0625c/volumes" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.054925 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5555c6f-b480-42a0-a2cd-e3ad41c74a2d" path="/var/lib/kubelet/pods/a5555c6f-b480-42a0-a2cd-e3ad41c74a2d/volumes" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.148539 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.184050 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb","Type":"ContainerStarted","Data":"bb7cfd3bc3fe622ecc58f106e9784deb7adaf5429971ecee1f2abf841b6effad"} Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.184094 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb","Type":"ContainerStarted","Data":"edbd95a13cb5a379463ff3234cc6f96ee1f7d2fccb462de92c2580d433b85a3a"} Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.189916 4849 generic.go:334] "Generic (PLEG): container finished" podID="effa400c-2afb-4abd-a644-6c15cdad3ee1" containerID="0d89e398b4272f12dbec65de3b811005b6c969c4cdcee69ddc8c1ee9f61cccd5" exitCode=0 Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.190230 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0f41b707-ab00-42a2-9472-cd761733addc" containerName="cinder-scheduler" containerID="cri-o://76caa296c758a2d6f13892e5c663cd03a8abba4518249fa55ad4c3ca8bf083a5" gracePeriod=30 Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.190351 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" 
event={"ID":"effa400c-2afb-4abd-a644-6c15cdad3ee1","Type":"ContainerDied","Data":"0d89e398b4272f12dbec65de3b811005b6c969c4cdcee69ddc8c1ee9f61cccd5"} Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.190386 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" event={"ID":"effa400c-2afb-4abd-a644-6c15cdad3ee1","Type":"ContainerDied","Data":"dcff2ab7e0c2bde67672d68ddf76362cc2f9ad08c2ffa831f21fc4fdd9dcb0cb"} Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.190398 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcff2ab7e0c2bde67672d68ddf76362cc2f9ad08c2ffa831f21fc4fdd9dcb0cb" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.190677 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0f41b707-ab00-42a2-9472-cd761733addc" containerName="probe" containerID="cri-o://47ad10e192eab044a506b16f61b3d058fe52c5d53ead1431411be3904fe2b71f" gracePeriod=30 Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.204700 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.253628 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68775b9c9d-w9j9w" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.331411 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57974449fd-mzjh9"] Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.331904 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57974449fd-mzjh9" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api-log" containerID="cri-o://3b4dde763ede66ee6751d9c3417eeafb2e11c589d3a8e4147ad97c4eefedba14" gracePeriod=30 Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.332633 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-57974449fd-mzjh9" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api" containerID="cri-o://4ff8295efe4b28b1dea15510201a5538d89fa33809686080fcb5724567d00e06" gracePeriod=30 Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.333895 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-svc\") pod \"effa400c-2afb-4abd-a644-6c15cdad3ee1\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.333929 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-sb\") pod \"effa400c-2afb-4abd-a644-6c15cdad3ee1\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.334070 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gq6p8\" (UniqueName: \"kubernetes.io/projected/effa400c-2afb-4abd-a644-6c15cdad3ee1-kube-api-access-gq6p8\") pod \"effa400c-2afb-4abd-a644-6c15cdad3ee1\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.334100 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-nb\") pod \"effa400c-2afb-4abd-a644-6c15cdad3ee1\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.334129 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-swift-storage-0\") pod \"effa400c-2afb-4abd-a644-6c15cdad3ee1\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.334208 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-config\") pod \"effa400c-2afb-4abd-a644-6c15cdad3ee1\" (UID: \"effa400c-2afb-4abd-a644-6c15cdad3ee1\") " Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.341658 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effa400c-2afb-4abd-a644-6c15cdad3ee1-kube-api-access-gq6p8" (OuterVolumeSpecName: "kube-api-access-gq6p8") pod "effa400c-2afb-4abd-a644-6c15cdad3ee1" (UID: "effa400c-2afb-4abd-a644-6c15cdad3ee1"). InnerVolumeSpecName "kube-api-access-gq6p8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.358298 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57974449fd-mzjh9" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.358477 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57974449fd-mzjh9" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": EOF" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.419807 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-config" (OuterVolumeSpecName: "config") pod "effa400c-2afb-4abd-a644-6c15cdad3ee1" (UID: "effa400c-2afb-4abd-a644-6c15cdad3ee1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.420892 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "effa400c-2afb-4abd-a644-6c15cdad3ee1" (UID: "effa400c-2afb-4abd-a644-6c15cdad3ee1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.423991 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "effa400c-2afb-4abd-a644-6c15cdad3ee1" (UID: "effa400c-2afb-4abd-a644-6c15cdad3ee1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.446154 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq6p8\" (UniqueName: \"kubernetes.io/projected/effa400c-2afb-4abd-a644-6c15cdad3ee1-kube-api-access-gq6p8\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.446197 4849 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.446212 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.446223 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.446967 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "effa400c-2afb-4abd-a644-6c15cdad3ee1" (UID: "effa400c-2afb-4abd-a644-6c15cdad3ee1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.493505 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "effa400c-2afb-4abd-a644-6c15cdad3ee1" (UID: "effa400c-2afb-4abd-a644-6c15cdad3ee1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.548917 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4849]: I0320 13:45:09.549200 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/effa400c-2afb-4abd-a644-6c15cdad3ee1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.206529 4849 generic.go:334] "Generic (PLEG): container finished" podID="0f41b707-ab00-42a2-9472-cd761733addc" containerID="47ad10e192eab044a506b16f61b3d058fe52c5d53ead1431411be3904fe2b71f" exitCode=0 Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.206593 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f41b707-ab00-42a2-9472-cd761733addc","Type":"ContainerDied","Data":"47ad10e192eab044a506b16f61b3d058fe52c5d53ead1431411be3904fe2b71f"} Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.208464 4849 generic.go:334] "Generic (PLEG): container finished" podID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerID="3b4dde763ede66ee6751d9c3417eeafb2e11c589d3a8e4147ad97c4eefedba14" exitCode=143 Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.208508 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57974449fd-mzjh9" event={"ID":"5ca49bbd-dd90-4eee-bd44-59934f7d757e","Type":"ContainerDied","Data":"3b4dde763ede66ee6751d9c3417eeafb2e11c589d3a8e4147ad97c4eefedba14"} Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.216281 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"c45c13e7-1cf2-4e2a-993b-83f1aa9428cb","Type":"ContainerStarted","Data":"3ec06683d7349a3f5ef492a4a54839f5a3e66465306ed515fa130d3074c7dc8d"} Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.216359 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-h6lmv" Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.216736 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.239659 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.239642212 podStartE2EDuration="3.239642212s" podCreationTimestamp="2026-03-20 13:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:10.235259916 +0000 UTC m=+1259.912983311" watchObservedRunningTime="2026-03-20 13:45:10.239642212 +0000 UTC m=+1259.917365607" Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.280683 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h6lmv"] Mar 20 13:45:10 crc kubenswrapper[4849]: I0320 13:45:10.316261 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-h6lmv"] Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.053419 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effa400c-2afb-4abd-a644-6c15cdad3ee1" path="/var/lib/kubelet/pods/effa400c-2afb-4abd-a644-6c15cdad3ee1/volumes" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.094780 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.108424 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58468695c6-sg4wf" Mar 
20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.226861 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerStarted","Data":"4878da155ec2f3f677ded54437c538a6498729d1b961f609d77d96527a8a3812"} Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.227218 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.228698 4849 generic.go:334] "Generic (PLEG): container finished" podID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerID="717f03af8e71b4627fc46deedfa0ff5f51c7251a3525a9f6fbfab7cf39df9d1d" exitCode=0 Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.228791 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68899bcb64-snjqk" event={"ID":"4623c171-dfb8-42e6-9038-a95ed2871b75","Type":"ContainerDied","Data":"717f03af8e71b4627fc46deedfa0ff5f51c7251a3525a9f6fbfab7cf39df9d1d"} Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.254411 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.755805848 podStartE2EDuration="10.254391877s" podCreationTimestamp="2026-03-20 13:45:01 +0000 UTC" firstStartedPulling="2026-03-20 13:45:03.483512772 +0000 UTC m=+1253.161236157" lastFinishedPulling="2026-03-20 13:45:09.982098791 +0000 UTC m=+1259.659822186" observedRunningTime="2026-03-20 13:45:11.251366367 +0000 UTC m=+1260.929089762" watchObservedRunningTime="2026-03-20 13:45:11.254391877 +0000 UTC m=+1260.932115272" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.423032 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5f89c68c76-ng5km"] Mar 20 13:45:11 crc kubenswrapper[4849]: E0320 13:45:11.423441 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effa400c-2afb-4abd-a644-6c15cdad3ee1" containerName="init" Mar 20 13:45:11 
crc kubenswrapper[4849]: I0320 13:45:11.423458 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa400c-2afb-4abd-a644-6c15cdad3ee1" containerName="init" Mar 20 13:45:11 crc kubenswrapper[4849]: E0320 13:45:11.423475 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effa400c-2afb-4abd-a644-6c15cdad3ee1" containerName="dnsmasq-dns" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.423482 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="effa400c-2afb-4abd-a644-6c15cdad3ee1" containerName="dnsmasq-dns" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.423681 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="effa400c-2afb-4abd-a644-6c15cdad3ee1" containerName="dnsmasq-dns" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.424884 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.437665 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f89c68c76-ng5km"] Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.483435 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717601ae-c668-47e5-8d74-a9cb9cf8e940-logs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.483613 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-combined-ca-bundle\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.483726 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzw5\" (UniqueName: \"kubernetes.io/projected/717601ae-c668-47e5-8d74-a9cb9cf8e940-kube-api-access-grzw5\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.483795 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-scripts\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.483915 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-config-data\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.483944 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-internal-tls-certs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.483972 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-public-tls-certs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.570587 4849 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68899bcb64-snjqk" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.586101 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717601ae-c668-47e5-8d74-a9cb9cf8e940-logs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.586170 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-combined-ca-bundle\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.586213 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzw5\" (UniqueName: \"kubernetes.io/projected/717601ae-c668-47e5-8d74-a9cb9cf8e940-kube-api-access-grzw5\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.586245 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-scripts\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.586283 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-config-data\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.586300 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-internal-tls-certs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.586318 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-public-tls-certs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.586633 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/717601ae-c668-47e5-8d74-a9cb9cf8e940-logs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.592743 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-public-tls-certs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.594239 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-scripts\") pod 
\"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.595556 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-internal-tls-certs\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.603884 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-config-data\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.604182 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzw5\" (UniqueName: \"kubernetes.io/projected/717601ae-c668-47e5-8d74-a9cb9cf8e940-kube-api-access-grzw5\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.613012 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717601ae-c668-47e5-8d74-a9cb9cf8e940-combined-ca-bundle\") pod \"placement-5f89c68c76-ng5km\" (UID: \"717601ae-c668-47e5-8d74-a9cb9cf8e940\") " pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:11 crc kubenswrapper[4849]: I0320 13:45:11.755239 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:12 crc kubenswrapper[4849]: I0320 13:45:12.341564 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f89c68c76-ng5km"] Mar 20 13:45:13 crc kubenswrapper[4849]: I0320 13:45:13.248037 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f89c68c76-ng5km" event={"ID":"717601ae-c668-47e5-8d74-a9cb9cf8e940","Type":"ContainerStarted","Data":"10b79eb1fbb6df3be048c4d0b6adb13c000038e9e8096eeae3279e34f3d8d6c0"} Mar 20 13:45:13 crc kubenswrapper[4849]: I0320 13:45:13.248479 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:13 crc kubenswrapper[4849]: I0320 13:45:13.248495 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f89c68c76-ng5km" event={"ID":"717601ae-c668-47e5-8d74-a9cb9cf8e940","Type":"ContainerStarted","Data":"b961debc160289a91ba70d0050d21118023771e4516bb346bf00c138df4bee05"} Mar 20 13:45:13 crc kubenswrapper[4849]: I0320 13:45:13.248505 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f89c68c76-ng5km" event={"ID":"717601ae-c668-47e5-8d74-a9cb9cf8e940","Type":"ContainerStarted","Data":"c015f5be45bbf142d168973e7eff7934e205e0de1263bfefa6bd35a8dc3e04df"} Mar 20 13:45:13 crc kubenswrapper[4849]: I0320 13:45:13.248720 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:13 crc kubenswrapper[4849]: I0320 13:45:13.278433 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5f89c68c76-ng5km" podStartSLOduration=2.278416871 podStartE2EDuration="2.278416871s" podCreationTimestamp="2026-03-20 13:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:13.277472596 +0000 UTC 
m=+1262.955196021" watchObservedRunningTime="2026-03-20 13:45:13.278416871 +0000 UTC m=+1262.956140266" Mar 20 13:45:13 crc kubenswrapper[4849]: I0320 13:45:13.863345 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7d5f885888-6vtg6" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.259041 4849 generic.go:334] "Generic (PLEG): container finished" podID="0f41b707-ab00-42a2-9472-cd761733addc" containerID="76caa296c758a2d6f13892e5c663cd03a8abba4518249fa55ad4c3ca8bf083a5" exitCode=0 Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.259112 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f41b707-ab00-42a2-9472-cd761733addc","Type":"ContainerDied","Data":"76caa296c758a2d6f13892e5c663cd03a8abba4518249fa55ad4c3ca8bf083a5"} Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.259163 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0f41b707-ab00-42a2-9472-cd761733addc","Type":"ContainerDied","Data":"067a2dae943ba4635a5ebeeae05b8aa787064218aa8b02bcf08912a45a08151b"} Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.259182 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="067a2dae943ba4635a5ebeeae05b8aa787064218aa8b02bcf08912a45a08151b" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.293434 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.343719 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data\") pod \"0f41b707-ab00-42a2-9472-cd761733addc\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.343806 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-combined-ca-bundle\") pod \"0f41b707-ab00-42a2-9472-cd761733addc\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.343866 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data-custom\") pod \"0f41b707-ab00-42a2-9472-cd761733addc\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.343892 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f41b707-ab00-42a2-9472-cd761733addc-etc-machine-id\") pod \"0f41b707-ab00-42a2-9472-cd761733addc\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.343916 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-scripts\") pod \"0f41b707-ab00-42a2-9472-cd761733addc\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.343986 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2z6m\" (UniqueName: 
\"kubernetes.io/projected/0f41b707-ab00-42a2-9472-cd761733addc-kube-api-access-g2z6m\") pod \"0f41b707-ab00-42a2-9472-cd761733addc\" (UID: \"0f41b707-ab00-42a2-9472-cd761733addc\") " Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.344515 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0f41b707-ab00-42a2-9472-cd761733addc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0f41b707-ab00-42a2-9472-cd761733addc" (UID: "0f41b707-ab00-42a2-9472-cd761733addc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.344918 4849 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f41b707-ab00-42a2-9472-cd761733addc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.350361 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f41b707-ab00-42a2-9472-cd761733addc-kube-api-access-g2z6m" (OuterVolumeSpecName: "kube-api-access-g2z6m") pod "0f41b707-ab00-42a2-9472-cd761733addc" (UID: "0f41b707-ab00-42a2-9472-cd761733addc"). InnerVolumeSpecName "kube-api-access-g2z6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.350678 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f41b707-ab00-42a2-9472-cd761733addc" (UID: "0f41b707-ab00-42a2-9472-cd761733addc"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.352327 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-scripts" (OuterVolumeSpecName: "scripts") pod "0f41b707-ab00-42a2-9472-cd761733addc" (UID: "0f41b707-ab00-42a2-9472-cd761733addc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.402934 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f41b707-ab00-42a2-9472-cd761733addc" (UID: "0f41b707-ab00-42a2-9472-cd761733addc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.446723 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.446751 4849 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.446763 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.446771 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2z6m\" (UniqueName: \"kubernetes.io/projected/0f41b707-ab00-42a2-9472-cd761733addc-kube-api-access-g2z6m\") on node \"crc\" DevicePath \"\"" Mar 
20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.463288 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data" (OuterVolumeSpecName: "config-data") pod "0f41b707-ab00-42a2-9472-cd761733addc" (UID: "0f41b707-ab00-42a2-9472-cd761733addc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.547062 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57974449fd-mzjh9" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.547559 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57974449fd-mzjh9" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.548085 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f41b707-ab00-42a2-9472-cd761733addc-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.765648 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-57974449fd-mzjh9" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:34584->10.217.0.164:9311: read: connection reset by peer" Mar 20 13:45:14 crc kubenswrapper[4849]: I0320 13:45:14.765685 4849 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-57974449fd-mzjh9" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:34580->10.217.0.164:9311: read: connection reset by peer" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.270129 4849 generic.go:334] "Generic (PLEG): container finished" podID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerID="4ff8295efe4b28b1dea15510201a5538d89fa33809686080fcb5724567d00e06" exitCode=0 Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.270223 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.271292 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57974449fd-mzjh9" event={"ID":"5ca49bbd-dd90-4eee-bd44-59934f7d757e","Type":"ContainerDied","Data":"4ff8295efe4b28b1dea15510201a5538d89fa33809686080fcb5724567d00e06"} Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.308022 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.347615 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.363014 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:45:15 crc kubenswrapper[4849]: E0320 13:45:15.363353 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f41b707-ab00-42a2-9472-cd761733addc" containerName="probe" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.363367 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f41b707-ab00-42a2-9472-cd761733addc" containerName="probe" Mar 20 13:45:15 crc kubenswrapper[4849]: E0320 13:45:15.363391 4849 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0f41b707-ab00-42a2-9472-cd761733addc" containerName="cinder-scheduler" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.363397 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f41b707-ab00-42a2-9472-cd761733addc" containerName="cinder-scheduler" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.363586 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f41b707-ab00-42a2-9472-cd761733addc" containerName="cinder-scheduler" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.363612 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f41b707-ab00-42a2-9472-cd761733addc" containerName="probe" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.368878 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.373080 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.374358 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.472132 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.472614 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 
13:45:15.472692 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.472725 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.472764 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hbwn\" (UniqueName: \"kubernetes.io/projected/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-kube-api-access-2hbwn\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.472783 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.490366 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574057 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-combined-ca-bundle\") pod \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574113 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ca49bbd-dd90-4eee-bd44-59934f7d757e-logs\") pod \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574193 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data-custom\") pod \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574248 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data\") pod \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574321 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwnv9\" (UniqueName: \"kubernetes.io/projected/5ca49bbd-dd90-4eee-bd44-59934f7d757e-kube-api-access-bwnv9\") pod \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\" (UID: \"5ca49bbd-dd90-4eee-bd44-59934f7d757e\") " Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574607 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2hbwn\" (UniqueName: \"kubernetes.io/projected/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-kube-api-access-2hbwn\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574645 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574701 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574746 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574813 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.574875 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.580650 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.581375 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ca49bbd-dd90-4eee-bd44-59934f7d757e-logs" (OuterVolumeSpecName: "logs") pod "5ca49bbd-dd90-4eee-bd44-59934f7d757e" (UID: "5ca49bbd-dd90-4eee-bd44-59934f7d757e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.581538 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca49bbd-dd90-4eee-bd44-59934f7d757e-kube-api-access-bwnv9" (OuterVolumeSpecName: "kube-api-access-bwnv9") pod "5ca49bbd-dd90-4eee-bd44-59934f7d757e" (UID: "5ca49bbd-dd90-4eee-bd44-59934f7d757e"). InnerVolumeSpecName "kube-api-access-bwnv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.582343 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.582389 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.586010 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5ca49bbd-dd90-4eee-bd44-59934f7d757e" (UID: "5ca49bbd-dd90-4eee-bd44-59934f7d757e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.594304 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-scripts\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.596572 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-config-data\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.600287 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hbwn\" (UniqueName: \"kubernetes.io/projected/2ff8045e-a7ad-400e-9e02-cbc7dc3a248c-kube-api-access-2hbwn\") pod \"cinder-scheduler-0\" (UID: \"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c\") " pod="openstack/cinder-scheduler-0" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.607633 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ca49bbd-dd90-4eee-bd44-59934f7d757e" (UID: "5ca49bbd-dd90-4eee-bd44-59934f7d757e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.638997 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data" (OuterVolumeSpecName: "config-data") pod "5ca49bbd-dd90-4eee-bd44-59934f7d757e" (UID: "5ca49bbd-dd90-4eee-bd44-59934f7d757e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.681563 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.681619 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ca49bbd-dd90-4eee-bd44-59934f7d757e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.681632 4849 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.681642 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ca49bbd-dd90-4eee-bd44-59934f7d757e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.681652 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwnv9\" (UniqueName: \"kubernetes.io/projected/5ca49bbd-dd90-4eee-bd44-59934f7d757e-kube-api-access-bwnv9\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:15 crc kubenswrapper[4849]: I0320 13:45:15.693534 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:45:16 crc kubenswrapper[4849]: W0320 13:45:16.174665 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff8045e_a7ad_400e_9e02_cbc7dc3a248c.slice/crio-0665f74c4fd26cec6cfc99a01220895abe731cee07de9e8ecde8ceb981fb96bd WatchSource:0}: Error finding container 0665f74c4fd26cec6cfc99a01220895abe731cee07de9e8ecde8ceb981fb96bd: Status 404 returned error can't find the container with id 0665f74c4fd26cec6cfc99a01220895abe731cee07de9e8ecde8ceb981fb96bd Mar 20 13:45:16 crc kubenswrapper[4849]: I0320 13:45:16.175397 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:45:16 crc kubenswrapper[4849]: I0320 13:45:16.279577 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57974449fd-mzjh9" Mar 20 13:45:16 crc kubenswrapper[4849]: I0320 13:45:16.279573 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57974449fd-mzjh9" event={"ID":"5ca49bbd-dd90-4eee-bd44-59934f7d757e","Type":"ContainerDied","Data":"88a398b40587a0158e1a29b3015c80622a46c13555524c7d59d45b27c499cbb0"} Mar 20 13:45:16 crc kubenswrapper[4849]: I0320 13:45:16.279636 4849 scope.go:117] "RemoveContainer" containerID="4ff8295efe4b28b1dea15510201a5538d89fa33809686080fcb5724567d00e06" Mar 20 13:45:16 crc kubenswrapper[4849]: I0320 13:45:16.281204 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c","Type":"ContainerStarted","Data":"0665f74c4fd26cec6cfc99a01220895abe731cee07de9e8ecde8ceb981fb96bd"} Mar 20 13:45:16 crc kubenswrapper[4849]: I0320 13:45:16.320802 4849 scope.go:117] "RemoveContainer" containerID="3b4dde763ede66ee6751d9c3417eeafb2e11c589d3a8e4147ad97c4eefedba14" Mar 20 13:45:16 crc kubenswrapper[4849]: I0320 13:45:16.328889 4849 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-57974449fd-mzjh9"] Mar 20 13:45:16 crc kubenswrapper[4849]: I0320 13:45:16.335951 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-57974449fd-mzjh9"] Mar 20 13:45:17 crc kubenswrapper[4849]: I0320 13:45:17.050171 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f41b707-ab00-42a2-9472-cd761733addc" path="/var/lib/kubelet/pods/0f41b707-ab00-42a2-9472-cd761733addc/volumes" Mar 20 13:45:17 crc kubenswrapper[4849]: I0320 13:45:17.051628 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" path="/var/lib/kubelet/pods/5ca49bbd-dd90-4eee-bd44-59934f7d757e/volumes" Mar 20 13:45:17 crc kubenswrapper[4849]: I0320 13:45:17.290531 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c","Type":"ContainerStarted","Data":"65831577ae6daaf0f4b40caa12dd9c9f7c3801382a1b3e5cc69604ba0d4e2efc"} Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.301715 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2ff8045e-a7ad-400e-9e02-cbc7dc3a248c","Type":"ContainerStarted","Data":"6e742d87439d559229f7a7efb2d33066bc2b21b3aa51de069e49d7fc9cba05ce"} Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.329809 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.3297859020000002 podStartE2EDuration="3.329785902s" podCreationTimestamp="2026-03-20 13:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:18.322417457 +0000 UTC m=+1268.000140872" watchObservedRunningTime="2026-03-20 13:45:18.329785902 +0000 UTC m=+1268.007509297" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 
13:45:18.972076 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 13:45:18 crc kubenswrapper[4849]: E0320 13:45:18.972758 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.972783 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api" Mar 20 13:45:18 crc kubenswrapper[4849]: E0320 13:45:18.972811 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api-log" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.972837 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api-log" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.973028 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api-log" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.973067 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca49bbd-dd90-4eee-bd44-59934f7d757e" containerName="barbican-api" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.973590 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.975878 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-cpq5n" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.976893 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.976961 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 13:45:18 crc kubenswrapper[4849]: I0320 13:45:18.996918 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.046441 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config-secret\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.046539 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.046617 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq8rm\" (UniqueName: \"kubernetes.io/projected/101c8a9b-0e13-45bb-8aaf-131590e60137-kube-api-access-tq8rm\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.046652 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-combined-ca-bundle\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.086631 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.087129 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="ceilometer-notification-agent" containerID="cri-o://5edbe1ef1aa6533e60349eea9ee2f7764f5cd71015c2a31a495f6aedb801a7f3" gracePeriod=30 Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.087121 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="sg-core" containerID="cri-o://93fa7757d63ec3636c95a1fa1e1a6545841f3e38217c7845960e3239eeaccbb1" gracePeriod=30 Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.087307 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="proxy-httpd" containerID="cri-o://4878da155ec2f3f677ded54437c538a6498729d1b961f609d77d96527a8a3812" gracePeriod=30 Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.087456 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="ceilometer-central-agent" containerID="cri-o://5c8ee983feef9979881c73e49772707c20c468ba347e3a33f0805a3e23087e6f" gracePeriod=30 Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.148610 4849 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-tq8rm\" (UniqueName: \"kubernetes.io/projected/101c8a9b-0e13-45bb-8aaf-131590e60137-kube-api-access-tq8rm\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.148707 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-combined-ca-bundle\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.148783 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config-secret\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.148889 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.150728 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.156466 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.165652 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config-secret\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.167452 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq8rm\" (UniqueName: \"kubernetes.io/projected/101c8a9b-0e13-45bb-8aaf-131590e60137-kube-api-access-tq8rm\") pod \"openstackclient\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.205491 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.206234 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.225858 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.242600 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.243948 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.252432 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.336010 4849 generic.go:334] "Generic (PLEG): container finished" podID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerID="4878da155ec2f3f677ded54437c538a6498729d1b961f609d77d96527a8a3812" exitCode=0 Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.336288 4849 generic.go:334] "Generic (PLEG): container finished" podID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerID="93fa7757d63ec3636c95a1fa1e1a6545841f3e38217c7845960e3239eeaccbb1" exitCode=2 Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.336050 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerDied","Data":"4878da155ec2f3f677ded54437c538a6498729d1b961f609d77d96527a8a3812"} Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.336374 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerDied","Data":"93fa7757d63ec3636c95a1fa1e1a6545841f3e38217c7845960e3239eeaccbb1"} Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.351668 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d6c971ba-2837-43ce-900d-c28d956ab162-openstack-config\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.351856 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf2q4\" (UniqueName: \"kubernetes.io/projected/d6c971ba-2837-43ce-900d-c28d956ab162-kube-api-access-mf2q4\") pod 
\"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.351939 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c971ba-2837-43ce-900d-c28d956ab162-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.352242 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d6c971ba-2837-43ce-900d-c28d956ab162-openstack-config-secret\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: E0320 13:45:19.377347 4849 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 13:45:19 crc kubenswrapper[4849]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_101c8a9b-0e13-45bb-8aaf-131590e60137_0(93f6cb482fd35abe38e32bf4f2077d2c5369c4c9d33718b7f08987debaf719d7): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"93f6cb482fd35abe38e32bf4f2077d2c5369c4c9d33718b7f08987debaf719d7" Netns:"/var/run/netns/1ab4c94f-19de-42d2-a6cb-db7de3467e0e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=93f6cb482fd35abe38e32bf4f2077d2c5369c4c9d33718b7f08987debaf719d7;K8S_POD_UID=101c8a9b-0e13-45bb-8aaf-131590e60137" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/101c8a9b-0e13-45bb-8aaf-131590e60137]: expected pod 
UID "101c8a9b-0e13-45bb-8aaf-131590e60137" but got "d6c971ba-2837-43ce-900d-c28d956ab162" from Kube API Mar 20 13:45:19 crc kubenswrapper[4849]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:45:19 crc kubenswrapper[4849]: > Mar 20 13:45:19 crc kubenswrapper[4849]: E0320 13:45:19.377414 4849 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 13:45:19 crc kubenswrapper[4849]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_101c8a9b-0e13-45bb-8aaf-131590e60137_0(93f6cb482fd35abe38e32bf4f2077d2c5369c4c9d33718b7f08987debaf719d7): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"93f6cb482fd35abe38e32bf4f2077d2c5369c4c9d33718b7f08987debaf719d7" Netns:"/var/run/netns/1ab4c94f-19de-42d2-a6cb-db7de3467e0e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=93f6cb482fd35abe38e32bf4f2077d2c5369c4c9d33718b7f08987debaf719d7;K8S_POD_UID=101c8a9b-0e13-45bb-8aaf-131590e60137" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/101c8a9b-0e13-45bb-8aaf-131590e60137]: expected pod UID "101c8a9b-0e13-45bb-8aaf-131590e60137" but got "d6c971ba-2837-43ce-900d-c28d956ab162" from Kube API Mar 20 13:45:19 crc kubenswrapper[4849]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:45:19 crc kubenswrapper[4849]: > pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.453510 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d6c971ba-2837-43ce-900d-c28d956ab162-openstack-config-secret\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.453615 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d6c971ba-2837-43ce-900d-c28d956ab162-openstack-config\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.453709 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf2q4\" (UniqueName: \"kubernetes.io/projected/d6c971ba-2837-43ce-900d-c28d956ab162-kube-api-access-mf2q4\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.453775 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c971ba-2837-43ce-900d-c28d956ab162-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.455980 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d6c971ba-2837-43ce-900d-c28d956ab162-openstack-config\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.458717 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d6c971ba-2837-43ce-900d-c28d956ab162-openstack-config-secret\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.462773 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c971ba-2837-43ce-900d-c28d956ab162-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.472295 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf2q4\" (UniqueName: \"kubernetes.io/projected/d6c971ba-2837-43ce-900d-c28d956ab162-kube-api-access-mf2q4\") pod \"openstackclient\" (UID: \"d6c971ba-2837-43ce-900d-c28d956ab162\") " pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.610734 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.700829 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.791595 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-74f54cb475-dnvws"] Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.793435 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.802886 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.803102 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.803250 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.807097 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74f54cb475-dnvws"] Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.863354 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-log-httpd\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.863710 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-etc-swift\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.863750 4849 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-internal-tls-certs\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.863802 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-public-tls-certs\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.863841 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-combined-ca-bundle\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.863874 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-config-data\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.863894 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j767h\" (UniqueName: \"kubernetes.io/projected/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-kube-api-access-j767h\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc 
kubenswrapper[4849]: I0320 13:45:19.863913 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-run-httpd\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.965523 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-internal-tls-certs\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.965581 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-public-tls-certs\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.965599 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-combined-ca-bundle\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.965625 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j767h\" (UniqueName: \"kubernetes.io/projected/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-kube-api-access-j767h\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: 
I0320 13:45:19.965641 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-config-data\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.965656 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-run-httpd\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.965702 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-log-httpd\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.965878 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-etc-swift\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.966350 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-run-httpd\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.966721 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-log-httpd\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.973013 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-config-data\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.973021 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-public-tls-certs\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.973675 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-internal-tls-certs\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.973992 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-etc-swift\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.976499 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-combined-ca-bundle\") pod 
\"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:19 crc kubenswrapper[4849]: I0320 13:45:19.985331 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j767h\" (UniqueName: \"kubernetes.io/projected/ccb7bb85-a9e1-4de8-83b1-081dd02455b9-kube-api-access-j767h\") pod \"swift-proxy-74f54cb475-dnvws\" (UID: \"ccb7bb85-a9e1-4de8-83b1-081dd02455b9\") " pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.118564 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74f54cb475-dnvws" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.212642 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:45:20 crc kubenswrapper[4849]: W0320 13:45:20.229478 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6c971ba_2837_43ce_900d_c28d956ab162.slice/crio-fca65617d2712bd90a9743db209acc04ac062dd690be7a2d19926346be5d0424 WatchSource:0}: Error finding container fca65617d2712bd90a9743db209acc04ac062dd690be7a2d19926346be5d0424: Status 404 returned error can't find the container with id fca65617d2712bd90a9743db209acc04ac062dd690be7a2d19926346be5d0424 Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.348737 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d6c971ba-2837-43ce-900d-c28d956ab162","Type":"ContainerStarted","Data":"fca65617d2712bd90a9743db209acc04ac062dd690be7a2d19926346be5d0424"} Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.351714 4849 generic.go:334] "Generic (PLEG): container finished" podID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerID="5edbe1ef1aa6533e60349eea9ee2f7764f5cd71015c2a31a495f6aedb801a7f3" exitCode=0 Mar 20 13:45:20 crc kubenswrapper[4849]: 
I0320 13:45:20.351742 4849 generic.go:334] "Generic (PLEG): container finished" podID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerID="5c8ee983feef9979881c73e49772707c20c468ba347e3a33f0805a3e23087e6f" exitCode=0 Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.351787 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.352309 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerDied","Data":"5edbe1ef1aa6533e60349eea9ee2f7764f5cd71015c2a31a495f6aedb801a7f3"} Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.352344 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerDied","Data":"5c8ee983feef9979881c73e49772707c20c468ba347e3a33f0805a3e23087e6f"} Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.355847 4849 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="101c8a9b-0e13-45bb-8aaf-131590e60137" podUID="d6c971ba-2837-43ce-900d-c28d956ab162" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.363151 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.474714 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq8rm\" (UniqueName: \"kubernetes.io/projected/101c8a9b-0e13-45bb-8aaf-131590e60137-kube-api-access-tq8rm\") pod \"101c8a9b-0e13-45bb-8aaf-131590e60137\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.474887 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config\") pod \"101c8a9b-0e13-45bb-8aaf-131590e60137\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.474910 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config-secret\") pod \"101c8a9b-0e13-45bb-8aaf-131590e60137\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.474987 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-combined-ca-bundle\") pod \"101c8a9b-0e13-45bb-8aaf-131590e60137\" (UID: \"101c8a9b-0e13-45bb-8aaf-131590e60137\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.475416 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "101c8a9b-0e13-45bb-8aaf-131590e60137" (UID: "101c8a9b-0e13-45bb-8aaf-131590e60137"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.479583 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/101c8a9b-0e13-45bb-8aaf-131590e60137-kube-api-access-tq8rm" (OuterVolumeSpecName: "kube-api-access-tq8rm") pod "101c8a9b-0e13-45bb-8aaf-131590e60137" (UID: "101c8a9b-0e13-45bb-8aaf-131590e60137"). InnerVolumeSpecName "kube-api-access-tq8rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.480296 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "101c8a9b-0e13-45bb-8aaf-131590e60137" (UID: "101c8a9b-0e13-45bb-8aaf-131590e60137"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.485936 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "101c8a9b-0e13-45bb-8aaf-131590e60137" (UID: "101c8a9b-0e13-45bb-8aaf-131590e60137"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.577143 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq8rm\" (UniqueName: \"kubernetes.io/projected/101c8a9b-0e13-45bb-8aaf-131590e60137-kube-api-access-tq8rm\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.577178 4849 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.577189 4849 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.577198 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/101c8a9b-0e13-45bb-8aaf-131590e60137-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.676289 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.694414 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74f54cb475-dnvws"] Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.694531 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.780678 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-scripts\") pod \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.780782 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-log-httpd\") pod \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.780807 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-sg-core-conf-yaml\") pod \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.780892 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfnnh\" (UniqueName: \"kubernetes.io/projected/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-kube-api-access-bfnnh\") pod \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.780927 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-combined-ca-bundle\") pod \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.781001 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-config-data\") pod \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.781070 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-run-httpd\") pod \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\" (UID: \"fa6ef88a-3c76-4f00-aaaf-6555548d9c26\") " Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.781513 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fa6ef88a-3c76-4f00-aaaf-6555548d9c26" (UID: "fa6ef88a-3c76-4f00-aaaf-6555548d9c26"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.781803 4849 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.782311 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fa6ef88a-3c76-4f00-aaaf-6555548d9c26" (UID: "fa6ef88a-3c76-4f00-aaaf-6555548d9c26"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.787286 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-kube-api-access-bfnnh" (OuterVolumeSpecName: "kube-api-access-bfnnh") pod "fa6ef88a-3c76-4f00-aaaf-6555548d9c26" (UID: "fa6ef88a-3c76-4f00-aaaf-6555548d9c26"). InnerVolumeSpecName "kube-api-access-bfnnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.789998 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-scripts" (OuterVolumeSpecName: "scripts") pod "fa6ef88a-3c76-4f00-aaaf-6555548d9c26" (UID: "fa6ef88a-3c76-4f00-aaaf-6555548d9c26"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.825505 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fa6ef88a-3c76-4f00-aaaf-6555548d9c26" (UID: "fa6ef88a-3c76-4f00-aaaf-6555548d9c26"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.876325 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa6ef88a-3c76-4f00-aaaf-6555548d9c26" (UID: "fa6ef88a-3c76-4f00-aaaf-6555548d9c26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.884731 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.884761 4849 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.884771 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfnnh\" (UniqueName: \"kubernetes.io/projected/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-kube-api-access-bfnnh\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.884779 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.884787 4849 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.909101 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-config-data" (OuterVolumeSpecName: "config-data") pod "fa6ef88a-3c76-4f00-aaaf-6555548d9c26" (UID: "fa6ef88a-3c76-4f00-aaaf-6555548d9c26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:20 crc kubenswrapper[4849]: I0320 13:45:20.986883 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6ef88a-3c76-4f00-aaaf-6555548d9c26-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.052344 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="101c8a9b-0e13-45bb-8aaf-131590e60137" path="/var/lib/kubelet/pods/101c8a9b-0e13-45bb-8aaf-131590e60137/volumes" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.370168 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa6ef88a-3c76-4f00-aaaf-6555548d9c26","Type":"ContainerDied","Data":"0b2df2e637d33a9ac3775e67fe1cb3353181c19587c1d52030a7f2072d3be7a6"} Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.370531 4849 scope.go:117] "RemoveContainer" containerID="4878da155ec2f3f677ded54437c538a6498729d1b961f609d77d96527a8a3812" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.370783 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.384305 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.384305 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74f54cb475-dnvws" event={"ID":"ccb7bb85-a9e1-4de8-83b1-081dd02455b9","Type":"ContainerStarted","Data":"92805574c75e9758a15eba89274424d5de0fb972c7459f6dce3c963d7bae157b"} Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.391179 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74f54cb475-dnvws" event={"ID":"ccb7bb85-a9e1-4de8-83b1-081dd02455b9","Type":"ContainerStarted","Data":"741b7e615b3856f0814558ea88218f6b1bce8467ec51ea435f1cdccba1e73122"} Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.407462 4849 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="101c8a9b-0e13-45bb-8aaf-131590e60137" podUID="d6c971ba-2837-43ce-900d-c28d956ab162" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.408613 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.419177 4849 scope.go:117] "RemoveContainer" containerID="93fa7757d63ec3636c95a1fa1e1a6545841f3e38217c7845960e3239eeaccbb1" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.438738 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.457005 4849 scope.go:117] "RemoveContainer" containerID="5edbe1ef1aa6533e60349eea9ee2f7764f5cd71015c2a31a495f6aedb801a7f3" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.460941 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:21 crc kubenswrapper[4849]: E0320 13:45:21.461349 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="sg-core" Mar 20 13:45:21 crc 
kubenswrapper[4849]: I0320 13:45:21.461361 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="sg-core" Mar 20 13:45:21 crc kubenswrapper[4849]: E0320 13:45:21.461373 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="ceilometer-central-agent" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.461381 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="ceilometer-central-agent" Mar 20 13:45:21 crc kubenswrapper[4849]: E0320 13:45:21.461406 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="proxy-httpd" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.461413 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="proxy-httpd" Mar 20 13:45:21 crc kubenswrapper[4849]: E0320 13:45:21.461435 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="ceilometer-notification-agent" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.461441 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="ceilometer-notification-agent" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.461599 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="ceilometer-central-agent" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.461613 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="proxy-httpd" Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.461624 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" 
containerName="ceilometer-notification-agent"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.461637 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" containerName="sg-core"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.463268 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.465849 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.466115 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.468891 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.484242 4849 scope.go:117] "RemoveContainer" containerID="5c8ee983feef9979881c73e49772707c20c468ba347e3a33f0805a3e23087e6f"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.495738 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-log-httpd\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.495797 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-config-data\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.495965 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-run-httpd\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.495981 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.496002 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b568t\" (UniqueName: \"kubernetes.io/projected/99613c42-8bb0-416a-99ee-22c349eaf052-kube-api-access-b568t\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.496026 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.496094 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-scripts\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.569733 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68899bcb64-snjqk" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.597932 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.597983 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-run-httpd\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.598011 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b568t\" (UniqueName: \"kubernetes.io/projected/99613c42-8bb0-416a-99ee-22c349eaf052-kube-api-access-b568t\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.598047 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.598088 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-scripts\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.598133 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-log-httpd\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.598171 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-config-data\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.600112 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-run-httpd\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.604116 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-log-httpd\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.606730 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-scripts\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.606891 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.608163 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-config-data\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.608590 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.616670 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b568t\" (UniqueName: \"kubernetes.io/projected/99613c42-8bb0-416a-99ee-22c349eaf052-kube-api-access-b568t\") pod \"ceilometer-0\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") " pod="openstack/ceilometer-0"
Mar 20 13:45:21 crc kubenswrapper[4849]: I0320 13:45:21.783697 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:45:22 crc kubenswrapper[4849]: I0320 13:45:22.394882 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74f54cb475-dnvws" event={"ID":"ccb7bb85-a9e1-4de8-83b1-081dd02455b9","Type":"ContainerStarted","Data":"107f82de8216d888d3013e236c63eb5299564e70a41e12a430d3796c045b4ff3"}
Mar 20 13:45:22 crc kubenswrapper[4849]: I0320 13:45:22.395219 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74f54cb475-dnvws"
Mar 20 13:45:22 crc kubenswrapper[4849]: I0320 13:45:22.395239 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74f54cb475-dnvws"
Mar 20 13:45:22 crc kubenswrapper[4849]: I0320 13:45:22.418860 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:45:22 crc kubenswrapper[4849]: I0320 13:45:22.420463 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-74f54cb475-dnvws" podStartSLOduration=3.420442439 podStartE2EDuration="3.420442439s" podCreationTimestamp="2026-03-20 13:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:22.413182347 +0000 UTC m=+1272.090905732" watchObservedRunningTime="2026-03-20 13:45:22.420442439 +0000 UTC m=+1272.098165834"
Mar 20 13:45:22 crc kubenswrapper[4849]: W0320 13:45:22.439913 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99613c42_8bb0_416a_99ee_22c349eaf052.slice/crio-f76b1229ffbcc2c78b2e628dd4bcc4064a5bb8339f38f6358e1541fb5a34397d WatchSource:0}: Error finding container f76b1229ffbcc2c78b2e628dd4bcc4064a5bb8339f38f6358e1541fb5a34397d: Status 404 returned error can't find the container with id f76b1229ffbcc2c78b2e628dd4bcc4064a5bb8339f38f6358e1541fb5a34397d
Mar 20 13:45:23 crc kubenswrapper[4849]: I0320 13:45:23.053257 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6ef88a-3c76-4f00-aaaf-6555548d9c26" path="/var/lib/kubelet/pods/fa6ef88a-3c76-4f00-aaaf-6555548d9c26/volumes"
Mar 20 13:45:23 crc kubenswrapper[4849]: I0320 13:45:23.408171 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerStarted","Data":"f76b1229ffbcc2c78b2e628dd4bcc4064a5bb8339f38f6358e1541fb5a34397d"}
Mar 20 13:45:25 crc kubenswrapper[4849]: I0320 13:45:25.129423 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74f54cb475-dnvws"
Mar 20 13:45:25 crc kubenswrapper[4849]: I0320 13:45:25.949689 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 20 13:45:28 crc kubenswrapper[4849]: I0320 13:45:28.013563 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:45:30 crc kubenswrapper[4849]: I0320 13:45:30.131649 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74f54cb475-dnvws"
Mar 20 13:45:30 crc kubenswrapper[4849]: I0320 13:45:30.485905 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d6c971ba-2837-43ce-900d-c28d956ab162","Type":"ContainerStarted","Data":"2facb8645122aaae7eccd8b73a0ab49d57400881106bde28d65963b86e8ad823"}
Mar 20 13:45:30 crc kubenswrapper[4849]: I0320 13:45:30.487543 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerStarted","Data":"a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d"}
Mar 20 13:45:30 crc kubenswrapper[4849]: I0320 13:45:30.505517 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.635045114 podStartE2EDuration="11.505499513s" podCreationTimestamp="2026-03-20 13:45:19 +0000 UTC" firstStartedPulling="2026-03-20 13:45:20.236713295 +0000 UTC m=+1269.914436690" lastFinishedPulling="2026-03-20 13:45:30.107167694 +0000 UTC m=+1279.784891089" observedRunningTime="2026-03-20 13:45:30.505164554 +0000 UTC m=+1280.182887949" watchObservedRunningTime="2026-03-20 13:45:30.505499513 +0000 UTC m=+1280.183222908"
Mar 20 13:45:31 crc kubenswrapper[4849]: I0320 13:45:31.499171 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerStarted","Data":"7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277"}
Mar 20 13:45:31 crc kubenswrapper[4849]: I0320 13:45:31.569697 4849 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-68899bcb64-snjqk" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Mar 20 13:45:31 crc kubenswrapper[4849]: I0320 13:45:31.569833 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:45:32 crc kubenswrapper[4849]: I0320 13:45:32.509503 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerStarted","Data":"c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f"}
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.077567 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-749b7fc4bf-nwzdf"
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.160482 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b4f644b5b-zlgdg"]
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.160731 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b4f644b5b-zlgdg" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerName="neutron-api" containerID="cri-o://227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67" gracePeriod=30
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.161148 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b4f644b5b-zlgdg" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerName="neutron-httpd" containerID="cri-o://4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6" gracePeriod=30
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.520419 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerStarted","Data":"740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2"}
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.520504 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="ceilometer-central-agent" containerID="cri-o://a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d" gracePeriod=30
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.520551 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.520581 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="ceilometer-notification-agent" containerID="cri-o://7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277" gracePeriod=30
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.520610 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="sg-core" containerID="cri-o://c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f" gracePeriod=30
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.520527 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="proxy-httpd" containerID="cri-o://740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2" gracePeriod=30
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.523581 4849 generic.go:334] "Generic (PLEG): container finished" podID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerID="4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6" exitCode=0
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.523617 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f644b5b-zlgdg" event={"ID":"dfdf38eb-05df-43ec-acb7-258da19c432a","Type":"ContainerDied","Data":"4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6"}
Mar 20 13:45:33 crc kubenswrapper[4849]: I0320 13:45:33.546519 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.838351096 podStartE2EDuration="12.546494891s" podCreationTimestamp="2026-03-20 13:45:21 +0000 UTC" firstStartedPulling="2026-03-20 13:45:22.44501247 +0000 UTC m=+1272.122735865" lastFinishedPulling="2026-03-20 13:45:33.153156265 +0000 UTC m=+1282.830879660" observedRunningTime="2026-03-20 13:45:33.540442101 +0000 UTC m=+1283.218165496" watchObservedRunningTime="2026-03-20 13:45:33.546494891 +0000 UTC m=+1283.224218296"
Mar 20 13:45:34 crc kubenswrapper[4849]: I0320 13:45:34.537661 4849 generic.go:334] "Generic (PLEG): container finished" podID="99613c42-8bb0-416a-99ee-22c349eaf052" containerID="740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2" exitCode=0
Mar 20 13:45:34 crc kubenswrapper[4849]: I0320 13:45:34.537708 4849 generic.go:334] "Generic (PLEG): container finished" podID="99613c42-8bb0-416a-99ee-22c349eaf052" containerID="c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f" exitCode=2
Mar 20 13:45:34 crc kubenswrapper[4849]: I0320 13:45:34.537719 4849 generic.go:334] "Generic (PLEG): container finished" podID="99613c42-8bb0-416a-99ee-22c349eaf052" containerID="7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277" exitCode=0
Mar 20 13:45:34 crc kubenswrapper[4849]: I0320 13:45:34.537743 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerDied","Data":"740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2"}
Mar 20 13:45:34 crc kubenswrapper[4849]: I0320 13:45:34.537773 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerDied","Data":"c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f"}
Mar 20 13:45:34 crc kubenswrapper[4849]: I0320 13:45:34.537785 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerDied","Data":"7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277"}
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.359757 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4f644b5b-zlgdg"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.452700 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-config\") pod \"dfdf38eb-05df-43ec-acb7-258da19c432a\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") "
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.452798 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-httpd-config\") pod \"dfdf38eb-05df-43ec-acb7-258da19c432a\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") "
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.452864 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbv4s\" (UniqueName: \"kubernetes.io/projected/dfdf38eb-05df-43ec-acb7-258da19c432a-kube-api-access-bbv4s\") pod \"dfdf38eb-05df-43ec-acb7-258da19c432a\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") "
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.452908 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-combined-ca-bundle\") pod \"dfdf38eb-05df-43ec-acb7-258da19c432a\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") "
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.452930 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-ovndb-tls-certs\") pod \"dfdf38eb-05df-43ec-acb7-258da19c432a\" (UID: \"dfdf38eb-05df-43ec-acb7-258da19c432a\") "
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.459486 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "dfdf38eb-05df-43ec-acb7-258da19c432a" (UID: "dfdf38eb-05df-43ec-acb7-258da19c432a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.480118 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfdf38eb-05df-43ec-acb7-258da19c432a-kube-api-access-bbv4s" (OuterVolumeSpecName: "kube-api-access-bbv4s") pod "dfdf38eb-05df-43ec-acb7-258da19c432a" (UID: "dfdf38eb-05df-43ec-acb7-258da19c432a"). InnerVolumeSpecName "kube-api-access-bbv4s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.504735 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfdf38eb-05df-43ec-acb7-258da19c432a" (UID: "dfdf38eb-05df-43ec-acb7-258da19c432a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.513311 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-config" (OuterVolumeSpecName: "config") pod "dfdf38eb-05df-43ec-acb7-258da19c432a" (UID: "dfdf38eb-05df-43ec-acb7-258da19c432a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.534668 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "dfdf38eb-05df-43ec-acb7-258da19c432a" (UID: "dfdf38eb-05df-43ec-acb7-258da19c432a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.546163 4849 generic.go:334] "Generic (PLEG): container finished" podID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerID="227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67" exitCode=0
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.546203 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f644b5b-zlgdg" event={"ID":"dfdf38eb-05df-43ec-acb7-258da19c432a","Type":"ContainerDied","Data":"227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67"}
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.546228 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4f644b5b-zlgdg" event={"ID":"dfdf38eb-05df-43ec-acb7-258da19c432a","Type":"ContainerDied","Data":"18fbcc631155497caad7a263000aeae1578eb13c7c25dc181f5b3a56c6d8efdb"}
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.546245 4849 scope.go:117] "RemoveContainer" containerID="4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.546361 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4f644b5b-zlgdg"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.555808 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.555852 4849 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.555865 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.555877 4849 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dfdf38eb-05df-43ec-acb7-258da19c432a-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.555885 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbv4s\" (UniqueName: \"kubernetes.io/projected/dfdf38eb-05df-43ec-acb7-258da19c432a-kube-api-access-bbv4s\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.617496 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b4f644b5b-zlgdg"]
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.618773 4849 scope.go:117] "RemoveContainer" containerID="227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.625121 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b4f644b5b-zlgdg"]
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.644782 4849 scope.go:117] "RemoveContainer" containerID="4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6"
Mar 20 13:45:35 crc kubenswrapper[4849]: E0320 13:45:35.645176 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6\": container with ID starting with 4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6 not found: ID does not exist" containerID="4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.645204 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6"} err="failed to get container status \"4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6\": rpc error: code = NotFound desc = could not find container \"4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6\": container with ID starting with 4d784ca893b7c4d6b48ece9efc1a6cdfc362f2a066b29a15fe07f46ea85b13e6 not found: ID does not exist"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.645223 4849 scope.go:117] "RemoveContainer" containerID="227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67"
Mar 20 13:45:35 crc kubenswrapper[4849]: E0320 13:45:35.645513 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67\": container with ID starting with 227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67 not found: ID does not exist" containerID="227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.645533 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67"} err="failed to get container status \"227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67\": rpc error: code = NotFound desc = could not find container \"227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67\": container with ID starting with 227d21fb5a59967e11cb46caa4c2c8a74d1a01c887890c5889b9b346cdd03a67 not found: ID does not exist"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.932091 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hgx5j"]
Mar 20 13:45:35 crc kubenswrapper[4849]: E0320 13:45:35.932509 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerName="neutron-httpd"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.932535 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerName="neutron-httpd"
Mar 20 13:45:35 crc kubenswrapper[4849]: E0320 13:45:35.932576 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerName="neutron-api"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.932585 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerName="neutron-api"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.932785 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerName="neutron-api"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.932852 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" containerName="neutron-httpd"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.933591 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.944262 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hgx5j"]
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.961883 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvwt\" (UniqueName: \"kubernetes.io/projected/fc762765-75e8-42df-bfd0-86cbad8172b3-kube-api-access-sqvwt\") pod \"nova-api-db-create-hgx5j\" (UID: \"fc762765-75e8-42df-bfd0-86cbad8172b3\") " pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:35 crc kubenswrapper[4849]: I0320 13:45:35.961985 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc762765-75e8-42df-bfd0-86cbad8172b3-operator-scripts\") pod \"nova-api-db-create-hgx5j\" (UID: \"fc762765-75e8-42df-bfd0-86cbad8172b3\") " pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.037715 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-2kmcz"]
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.039053 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.049665 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2kmcz"]
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.063403 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d86487-6a56-429a-a4af-afc82ed6a843-operator-scripts\") pod \"nova-cell0-db-create-2kmcz\" (UID: \"b1d86487-6a56-429a-a4af-afc82ed6a843\") " pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.063599 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqvwt\" (UniqueName: \"kubernetes.io/projected/fc762765-75e8-42df-bfd0-86cbad8172b3-kube-api-access-sqvwt\") pod \"nova-api-db-create-hgx5j\" (UID: \"fc762765-75e8-42df-bfd0-86cbad8172b3\") " pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.063710 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhl7\" (UniqueName: \"kubernetes.io/projected/b1d86487-6a56-429a-a4af-afc82ed6a843-kube-api-access-4lhl7\") pod \"nova-cell0-db-create-2kmcz\" (UID: \"b1d86487-6a56-429a-a4af-afc82ed6a843\") " pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.063777 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc762765-75e8-42df-bfd0-86cbad8172b3-operator-scripts\") pod \"nova-api-db-create-hgx5j\" (UID: \"fc762765-75e8-42df-bfd0-86cbad8172b3\") " pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.067321 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc762765-75e8-42df-bfd0-86cbad8172b3-operator-scripts\") pod \"nova-api-db-create-hgx5j\" (UID: \"fc762765-75e8-42df-bfd0-86cbad8172b3\") " pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.084108 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqvwt\" (UniqueName: \"kubernetes.io/projected/fc762765-75e8-42df-bfd0-86cbad8172b3-kube-api-access-sqvwt\") pod \"nova-api-db-create-hgx5j\" (UID: \"fc762765-75e8-42df-bfd0-86cbad8172b3\") " pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.137715 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-653c-account-create-update-7p7tb"]
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.139136 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-653c-account-create-update-7p7tb"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.142879 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.147638 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-653c-account-create-update-7p7tb"]
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.165019 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d86487-6a56-429a-a4af-afc82ed6a843-operator-scripts\") pod \"nova-cell0-db-create-2kmcz\" (UID: \"b1d86487-6a56-429a-a4af-afc82ed6a843\") " pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.165393 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lhl7\" (UniqueName: \"kubernetes.io/projected/b1d86487-6a56-429a-a4af-afc82ed6a843-kube-api-access-4lhl7\") pod \"nova-cell0-db-create-2kmcz\" (UID: \"b1d86487-6a56-429a-a4af-afc82ed6a843\") " pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.167576 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d86487-6a56-429a-a4af-afc82ed6a843-operator-scripts\") pod \"nova-cell0-db-create-2kmcz\" (UID: \"b1d86487-6a56-429a-a4af-afc82ed6a843\") " pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.180893 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lhl7\" (UniqueName: \"kubernetes.io/projected/b1d86487-6a56-429a-a4af-afc82ed6a843-kube-api-access-4lhl7\") pod \"nova-cell0-db-create-2kmcz\" (UID: \"b1d86487-6a56-429a-a4af-afc82ed6a843\") " pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.236710 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4lmj8"]
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.237858 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4lmj8"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.245308 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4lmj8"]
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.260919 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.267289 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5336158a-f129-45cf-a73c-5e0733002023-operator-scripts\") pod \"nova-api-653c-account-create-update-7p7tb\" (UID: \"5336158a-f129-45cf-a73c-5e0733002023\") " pod="openstack/nova-api-653c-account-create-update-7p7tb"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.267325 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9llq4\" (UniqueName: \"kubernetes.io/projected/5336158a-f129-45cf-a73c-5e0733002023-kube-api-access-9llq4\") pod \"nova-api-653c-account-create-update-7p7tb\" (UID: \"5336158a-f129-45cf-a73c-5e0733002023\") " pod="openstack/nova-api-653c-account-create-update-7p7tb"
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.366351 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5bc4-account-create-update-dggjx"]
Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.367519 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.368123 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb98m\" (UniqueName: \"kubernetes.io/projected/5fafae6e-b99e-4561-8caa-84a392b5e463-kube-api-access-gb98m\") pod \"nova-cell1-db-create-4lmj8\" (UID: \"5fafae6e-b99e-4561-8caa-84a392b5e463\") " pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.368183 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fafae6e-b99e-4561-8caa-84a392b5e463-operator-scripts\") pod \"nova-cell1-db-create-4lmj8\" (UID: \"5fafae6e-b99e-4561-8caa-84a392b5e463\") " pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.368284 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5336158a-f129-45cf-a73c-5e0733002023-operator-scripts\") pod \"nova-api-653c-account-create-update-7p7tb\" (UID: \"5336158a-f129-45cf-a73c-5e0733002023\") " pod="openstack/nova-api-653c-account-create-update-7p7tb" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.369284 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9llq4\" (UniqueName: \"kubernetes.io/projected/5336158a-f129-45cf-a73c-5e0733002023-kube-api-access-9llq4\") pod \"nova-api-653c-account-create-update-7p7tb\" (UID: \"5336158a-f129-45cf-a73c-5e0733002023\") " pod="openstack/nova-api-653c-account-create-update-7p7tb" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.369367 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5336158a-f129-45cf-a73c-5e0733002023-operator-scripts\") pod \"nova-api-653c-account-create-update-7p7tb\" (UID: \"5336158a-f129-45cf-a73c-5e0733002023\") " pod="openstack/nova-api-653c-account-create-update-7p7tb" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.369534 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2kmcz" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.370229 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.393677 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9llq4\" (UniqueName: \"kubernetes.io/projected/5336158a-f129-45cf-a73c-5e0733002023-kube-api-access-9llq4\") pod \"nova-api-653c-account-create-update-7p7tb\" (UID: \"5336158a-f129-45cf-a73c-5e0733002023\") " pod="openstack/nova-api-653c-account-create-update-7p7tb" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.397783 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5bc4-account-create-update-dggjx"] Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.457105 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-653c-account-create-update-7p7tb" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.472026 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb98m\" (UniqueName: \"kubernetes.io/projected/5fafae6e-b99e-4561-8caa-84a392b5e463-kube-api-access-gb98m\") pod \"nova-cell1-db-create-4lmj8\" (UID: \"5fafae6e-b99e-4561-8caa-84a392b5e463\") " pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.472262 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2vw4\" (UniqueName: \"kubernetes.io/projected/84081e43-a4b1-4462-9b31-21d5d443d016-kube-api-access-z2vw4\") pod \"nova-cell0-5bc4-account-create-update-dggjx\" (UID: \"84081e43-a4b1-4462-9b31-21d5d443d016\") " pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.472321 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84081e43-a4b1-4462-9b31-21d5d443d016-operator-scripts\") pod \"nova-cell0-5bc4-account-create-update-dggjx\" (UID: \"84081e43-a4b1-4462-9b31-21d5d443d016\") " pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.472356 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fafae6e-b99e-4561-8caa-84a392b5e463-operator-scripts\") pod \"nova-cell1-db-create-4lmj8\" (UID: \"5fafae6e-b99e-4561-8caa-84a392b5e463\") " pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.473325 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5fafae6e-b99e-4561-8caa-84a392b5e463-operator-scripts\") pod \"nova-cell1-db-create-4lmj8\" (UID: \"5fafae6e-b99e-4561-8caa-84a392b5e463\") " pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.491807 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb98m\" (UniqueName: \"kubernetes.io/projected/5fafae6e-b99e-4561-8caa-84a392b5e463-kube-api-access-gb98m\") pod \"nova-cell1-db-create-4lmj8\" (UID: \"5fafae6e-b99e-4561-8caa-84a392b5e463\") " pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.555767 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c417-account-create-update-x9vgq"] Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.559997 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.563481 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.567500 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c417-account-create-update-x9vgq"] Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.577220 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2vw4\" (UniqueName: \"kubernetes.io/projected/84081e43-a4b1-4462-9b31-21d5d443d016-kube-api-access-z2vw4\") pod \"nova-cell0-5bc4-account-create-update-dggjx\" (UID: \"84081e43-a4b1-4462-9b31-21d5d443d016\") " pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.577278 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/84081e43-a4b1-4462-9b31-21d5d443d016-operator-scripts\") pod \"nova-cell0-5bc4-account-create-update-dggjx\" (UID: \"84081e43-a4b1-4462-9b31-21d5d443d016\") " pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.578161 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84081e43-a4b1-4462-9b31-21d5d443d016-operator-scripts\") pod \"nova-cell0-5bc4-account-create-update-dggjx\" (UID: \"84081e43-a4b1-4462-9b31-21d5d443d016\") " pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.603534 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2vw4\" (UniqueName: \"kubernetes.io/projected/84081e43-a4b1-4462-9b31-21d5d443d016-kube-api-access-z2vw4\") pod \"nova-cell0-5bc4-account-create-update-dggjx\" (UID: \"84081e43-a4b1-4462-9b31-21d5d443d016\") " pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.667694 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.680579 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg89z\" (UniqueName: \"kubernetes.io/projected/d6fe63e8-731d-4c04-9679-25635974e8ce-kube-api-access-jg89z\") pod \"nova-cell1-c417-account-create-update-x9vgq\" (UID: \"d6fe63e8-731d-4c04-9679-25635974e8ce\") " pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.680899 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fe63e8-731d-4c04-9679-25635974e8ce-operator-scripts\") pod \"nova-cell1-c417-account-create-update-x9vgq\" (UID: \"d6fe63e8-731d-4c04-9679-25635974e8ce\") " pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.681408 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.784933 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg89z\" (UniqueName: \"kubernetes.io/projected/d6fe63e8-731d-4c04-9679-25635974e8ce-kube-api-access-jg89z\") pod \"nova-cell1-c417-account-create-update-x9vgq\" (UID: \"d6fe63e8-731d-4c04-9679-25635974e8ce\") " pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.785516 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fe63e8-731d-4c04-9679-25635974e8ce-operator-scripts\") pod \"nova-cell1-c417-account-create-update-x9vgq\" (UID: \"d6fe63e8-731d-4c04-9679-25635974e8ce\") " pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.786695 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fe63e8-731d-4c04-9679-25635974e8ce-operator-scripts\") pod \"nova-cell1-c417-account-create-update-x9vgq\" (UID: \"d6fe63e8-731d-4c04-9679-25635974e8ce\") " pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.816614 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg89z\" (UniqueName: \"kubernetes.io/projected/d6fe63e8-731d-4c04-9679-25635974e8ce-kube-api-access-jg89z\") pod \"nova-cell1-c417-account-create-update-x9vgq\" (UID: \"d6fe63e8-731d-4c04-9679-25635974e8ce\") " pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.841800 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hgx5j"] Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 
13:45:36.888858 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:36 crc kubenswrapper[4849]: I0320 13:45:36.973302 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-2kmcz"] Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.059542 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfdf38eb-05df-43ec-acb7-258da19c432a" path="/var/lib/kubelet/pods/dfdf38eb-05df-43ec-acb7-258da19c432a/volumes" Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.101658 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-653c-account-create-update-7p7tb"] Mar 20 13:45:37 crc kubenswrapper[4849]: W0320 13:45:37.116269 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5336158a_f129_45cf_a73c_5e0733002023.slice/crio-579b902ec38203e3a861c5a820a3a23cc6347230cb845b27872f07a8869eed64 WatchSource:0}: Error finding container 579b902ec38203e3a861c5a820a3a23cc6347230cb845b27872f07a8869eed64: Status 404 returned error can't find the container with id 579b902ec38203e3a861c5a820a3a23cc6347230cb845b27872f07a8869eed64 Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.253412 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4lmj8"] Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.390395 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5bc4-account-create-update-dggjx"] Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.475375 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c417-account-create-update-x9vgq"] Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.598564 4849 generic.go:334] "Generic (PLEG): container finished" podID="b1d86487-6a56-429a-a4af-afc82ed6a843" 
containerID="5f5e7b64fb88740552ddcefe1d16d6df85bd2a42ee1b612805991a0304c39adf" exitCode=0 Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.598649 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2kmcz" event={"ID":"b1d86487-6a56-429a-a4af-afc82ed6a843","Type":"ContainerDied","Data":"5f5e7b64fb88740552ddcefe1d16d6df85bd2a42ee1b612805991a0304c39adf"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.598943 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2kmcz" event={"ID":"b1d86487-6a56-429a-a4af-afc82ed6a843","Type":"ContainerStarted","Data":"c21d113cbddce24793dc8aaaf796ab9702d0dd46892f50a71c0ed7c7588ddb08"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.601548 4849 generic.go:334] "Generic (PLEG): container finished" podID="fc762765-75e8-42df-bfd0-86cbad8172b3" containerID="242c3caed3546905629881d9373bef7632d69447a8464addbbeb172f167eef6a" exitCode=0 Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.601609 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hgx5j" event={"ID":"fc762765-75e8-42df-bfd0-86cbad8172b3","Type":"ContainerDied","Data":"242c3caed3546905629881d9373bef7632d69447a8464addbbeb172f167eef6a"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.601632 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hgx5j" event={"ID":"fc762765-75e8-42df-bfd0-86cbad8172b3","Type":"ContainerStarted","Data":"9178da49c7c9ec4cdd1b83c3f75e311a161d57b24896f265f4c47e4ed4800964"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.604956 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c417-account-create-update-x9vgq" event={"ID":"d6fe63e8-731d-4c04-9679-25635974e8ce","Type":"ContainerStarted","Data":"2b9c3e7b69790e3b320b4faa37294ed3ad072fa6cfd70b8e2c58c003ecf1140e"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.606313 4849 
generic.go:334] "Generic (PLEG): container finished" podID="5336158a-f129-45cf-a73c-5e0733002023" containerID="98cb3e5a09830707adebabf8d1ec2cb833590015c9eba996562647114642f205" exitCode=0 Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.606358 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-653c-account-create-update-7p7tb" event={"ID":"5336158a-f129-45cf-a73c-5e0733002023","Type":"ContainerDied","Data":"98cb3e5a09830707adebabf8d1ec2cb833590015c9eba996562647114642f205"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.606398 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-653c-account-create-update-7p7tb" event={"ID":"5336158a-f129-45cf-a73c-5e0733002023","Type":"ContainerStarted","Data":"579b902ec38203e3a861c5a820a3a23cc6347230cb845b27872f07a8869eed64"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.614434 4849 generic.go:334] "Generic (PLEG): container finished" podID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerID="0a9174601ae80fb33d84c0233114f71b6f537dcd6ab8a11c4d2eb6c6ce77db76" exitCode=137 Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.614509 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68899bcb64-snjqk" event={"ID":"4623c171-dfb8-42e6-9038-a95ed2871b75","Type":"ContainerDied","Data":"0a9174601ae80fb33d84c0233114f71b6f537dcd6ab8a11c4d2eb6c6ce77db76"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.616488 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4lmj8" event={"ID":"5fafae6e-b99e-4561-8caa-84a392b5e463","Type":"ContainerStarted","Data":"e38ae252582e21632d2936628a2cb4c7a9fd76f22b0af33e9f8e3783551d07de"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.620567 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" 
event={"ID":"84081e43-a4b1-4462-9b31-21d5d443d016","Type":"ContainerStarted","Data":"3d1f3917c78d40b6636702495cc124fed1e1b6fc58620eeafb35cdf4c1512f3a"} Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.752534 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68899bcb64-snjqk" Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.913023 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-config-data\") pod \"4623c171-dfb8-42e6-9038-a95ed2871b75\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.913071 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4623c171-dfb8-42e6-9038-a95ed2871b75-logs\") pod \"4623c171-dfb8-42e6-9038-a95ed2871b75\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.913168 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-scripts\") pod \"4623c171-dfb8-42e6-9038-a95ed2871b75\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.913189 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-combined-ca-bundle\") pod \"4623c171-dfb8-42e6-9038-a95ed2871b75\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.913244 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/4623c171-dfb8-42e6-9038-a95ed2871b75-kube-api-access-8vzpx\") pod 
\"4623c171-dfb8-42e6-9038-a95ed2871b75\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.913281 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-tls-certs\") pod \"4623c171-dfb8-42e6-9038-a95ed2871b75\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.913308 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-secret-key\") pod \"4623c171-dfb8-42e6-9038-a95ed2871b75\" (UID: \"4623c171-dfb8-42e6-9038-a95ed2871b75\") " Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.917764 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4623c171-dfb8-42e6-9038-a95ed2871b75-logs" (OuterVolumeSpecName: "logs") pod "4623c171-dfb8-42e6-9038-a95ed2871b75" (UID: "4623c171-dfb8-42e6-9038-a95ed2871b75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.920762 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4623c171-dfb8-42e6-9038-a95ed2871b75-kube-api-access-8vzpx" (OuterVolumeSpecName: "kube-api-access-8vzpx") pod "4623c171-dfb8-42e6-9038-a95ed2871b75" (UID: "4623c171-dfb8-42e6-9038-a95ed2871b75"). InnerVolumeSpecName "kube-api-access-8vzpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.921611 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4623c171-dfb8-42e6-9038-a95ed2871b75" (UID: "4623c171-dfb8-42e6-9038-a95ed2871b75"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.943725 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-scripts" (OuterVolumeSpecName: "scripts") pod "4623c171-dfb8-42e6-9038-a95ed2871b75" (UID: "4623c171-dfb8-42e6-9038-a95ed2871b75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.954241 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4623c171-dfb8-42e6-9038-a95ed2871b75" (UID: "4623c171-dfb8-42e6-9038-a95ed2871b75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.958619 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-config-data" (OuterVolumeSpecName: "config-data") pod "4623c171-dfb8-42e6-9038-a95ed2871b75" (UID: "4623c171-dfb8-42e6-9038-a95ed2871b75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4849]: I0320 13:45:37.972035 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "4623c171-dfb8-42e6-9038-a95ed2871b75" (UID: "4623c171-dfb8-42e6-9038-a95ed2871b75"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.014873 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vzpx\" (UniqueName: \"kubernetes.io/projected/4623c171-dfb8-42e6-9038-a95ed2871b75-kube-api-access-8vzpx\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.015037 4849 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.015113 4849 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.015201 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.015277 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4623c171-dfb8-42e6-9038-a95ed2871b75-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.015429 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4623c171-dfb8-42e6-9038-a95ed2871b75-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.015498 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4623c171-dfb8-42e6-9038-a95ed2871b75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.098913 4849 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fafae6e_b99e_4561_8caa_84a392b5e463.slice/crio-conmon-ee8847e70eb8c7872f30a5b82c3d3210d7e9f84b912ec52e1657c62532e3b391.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fafae6e_b99e_4561_8caa_84a392b5e463.slice/crio-ee8847e70eb8c7872f30a5b82c3d3210d7e9f84b912ec52e1657c62532e3b391.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.260015 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.325009 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-config-data\") pod \"99613c42-8bb0-416a-99ee-22c349eaf052\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") "
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.325061 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-scripts\") pod \"99613c42-8bb0-416a-99ee-22c349eaf052\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") "
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.325084 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-run-httpd\") pod \"99613c42-8bb0-416a-99ee-22c349eaf052\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") "
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.325156 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-combined-ca-bundle\") pod \"99613c42-8bb0-416a-99ee-22c349eaf052\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") "
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.325180 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b568t\" (UniqueName: \"kubernetes.io/projected/99613c42-8bb0-416a-99ee-22c349eaf052-kube-api-access-b568t\") pod \"99613c42-8bb0-416a-99ee-22c349eaf052\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") "
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.325200 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-log-httpd\") pod \"99613c42-8bb0-416a-99ee-22c349eaf052\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") "
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.325240 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-sg-core-conf-yaml\") pod \"99613c42-8bb0-416a-99ee-22c349eaf052\" (UID: \"99613c42-8bb0-416a-99ee-22c349eaf052\") "
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.326313 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "99613c42-8bb0-416a-99ee-22c349eaf052" (UID: "99613c42-8bb0-416a-99ee-22c349eaf052"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.336146 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "99613c42-8bb0-416a-99ee-22c349eaf052" (UID: "99613c42-8bb0-416a-99ee-22c349eaf052"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.338037 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-scripts" (OuterVolumeSpecName: "scripts") pod "99613c42-8bb0-416a-99ee-22c349eaf052" (UID: "99613c42-8bb0-416a-99ee-22c349eaf052"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.344316 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99613c42-8bb0-416a-99ee-22c349eaf052-kube-api-access-b568t" (OuterVolumeSpecName: "kube-api-access-b568t") pod "99613c42-8bb0-416a-99ee-22c349eaf052" (UID: "99613c42-8bb0-416a-99ee-22c349eaf052"). InnerVolumeSpecName "kube-api-access-b568t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.402865 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "99613c42-8bb0-416a-99ee-22c349eaf052" (UID: "99613c42-8bb0-416a-99ee-22c349eaf052"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.427207 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b568t\" (UniqueName: \"kubernetes.io/projected/99613c42-8bb0-416a-99ee-22c349eaf052-kube-api-access-b568t\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.427233 4849 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.427243 4849 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.427251 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.427259 4849 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99613c42-8bb0-416a-99ee-22c349eaf052-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.479712 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-config-data" (OuterVolumeSpecName: "config-data") pod "99613c42-8bb0-416a-99ee-22c349eaf052" (UID: "99613c42-8bb0-416a-99ee-22c349eaf052"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.484952 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99613c42-8bb0-416a-99ee-22c349eaf052" (UID: "99613c42-8bb0-416a-99ee-22c349eaf052"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.528886 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.528923 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99613c42-8bb0-416a-99ee-22c349eaf052-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.631963 4849 generic.go:334] "Generic (PLEG): container finished" podID="84081e43-a4b1-4462-9b31-21d5d443d016" containerID="9f9c7ef807bdbe8189489ac4538fbbe1b937cc89119c08518ef8d37fd4ed385a" exitCode=0
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.632022 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" event={"ID":"84081e43-a4b1-4462-9b31-21d5d443d016","Type":"ContainerDied","Data":"9f9c7ef807bdbe8189489ac4538fbbe1b937cc89119c08518ef8d37fd4ed385a"}
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.633731 4849 generic.go:334] "Generic (PLEG): container finished" podID="d6fe63e8-731d-4c04-9679-25635974e8ce" containerID="2d08047a2ee6678fdad94ab7e5aa89afb84af0849d91594a796ef4ec95434b50" exitCode=0
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.633809 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c417-account-create-update-x9vgq" event={"ID":"d6fe63e8-731d-4c04-9679-25635974e8ce","Type":"ContainerDied","Data":"2d08047a2ee6678fdad94ab7e5aa89afb84af0849d91594a796ef4ec95434b50"}
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.641549 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68899bcb64-snjqk"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.641623 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68899bcb64-snjqk" event={"ID":"4623c171-dfb8-42e6-9038-a95ed2871b75","Type":"ContainerDied","Data":"44ee84ff30fb1326cc969d97bb705da62389e6f9763085b1f95fb6fd97bbf1e9"}
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.641677 4849 scope.go:117] "RemoveContainer" containerID="717f03af8e71b4627fc46deedfa0ff5f51c7251a3525a9f6fbfab7cf39df9d1d"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.644037 4849 generic.go:334] "Generic (PLEG): container finished" podID="99613c42-8bb0-416a-99ee-22c349eaf052" containerID="a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d" exitCode=0
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.644089 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerDied","Data":"a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d"}
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.644113 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99613c42-8bb0-416a-99ee-22c349eaf052","Type":"ContainerDied","Data":"f76b1229ffbcc2c78b2e628dd4bcc4064a5bb8339f38f6358e1541fb5a34397d"}
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.644162 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.647709 4849 generic.go:334] "Generic (PLEG): container finished" podID="5fafae6e-b99e-4561-8caa-84a392b5e463" containerID="ee8847e70eb8c7872f30a5b82c3d3210d7e9f84b912ec52e1657c62532e3b391" exitCode=0
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.647958 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4lmj8" event={"ID":"5fafae6e-b99e-4561-8caa-84a392b5e463","Type":"ContainerDied","Data":"ee8847e70eb8c7872f30a5b82c3d3210d7e9f84b912ec52e1657c62532e3b391"}
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.695635 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.710733 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.764913 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.766081 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="proxy-httpd"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766107 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="proxy-httpd"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.766121 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="ceilometer-notification-agent"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766128 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="ceilometer-notification-agent"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.766142 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon-log"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766149 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon-log"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.766163 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="sg-core"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766169 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="sg-core"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.766187 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="ceilometer-central-agent"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766193 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="ceilometer-central-agent"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.766200 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766206 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766378 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="sg-core"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766390 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="ceilometer-notification-agent"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766401 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="proxy-httpd"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766417 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" containerName="ceilometer-central-agent"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766430 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.766439 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" containerName="horizon-log"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.768236 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.771372 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.771487 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.778344 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.796311 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68899bcb64-snjqk"]
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.807103 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68899bcb64-snjqk"]
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.840885 4849 scope.go:117] "RemoveContainer" containerID="0a9174601ae80fb33d84c0233114f71b6f537dcd6ab8a11c4d2eb6c6ce77db76"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.858976 4849 scope.go:117] "RemoveContainer" 
containerID="740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.876067 4849 scope.go:117] "RemoveContainer" containerID="c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.895042 4849 scope.go:117] "RemoveContainer" containerID="7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.931028 4849 scope.go:117] "RemoveContainer" containerID="a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.935132 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-log-httpd\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.935191 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-run-httpd\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.935438 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-config-data\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.935468 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbjzv\" (UniqueName: \"kubernetes.io/projected/1704cf65-fb71-4b3d-aea6-996d785fa6f4-kube-api-access-fbjzv\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.935525 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.935566 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.935612 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-scripts\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.953219 4849 scope.go:117] "RemoveContainer" containerID="740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.954351 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2\": container with ID starting with 740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2 not found: ID does not exist" containerID="740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.954387 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2"} err="failed to get container status \"740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2\": rpc error: code = NotFound desc = could not find container \"740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2\": container with ID starting with 740f1dbbca74852ec1915c8a31c94f79c411b10884395c895e15807592bcc6e2 not found: ID does not exist"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.954414 4849 scope.go:117] "RemoveContainer" containerID="c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.955114 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f\": container with ID starting with c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f not found: ID does not exist" containerID="c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.955220 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f"} err="failed to get container status \"c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f\": rpc error: code = NotFound desc = could not find container \"c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f\": container with ID starting with c8627e0f4cdba9d2ff72e5b8dc6077f812670cd313a03e8350353de9f4abfe8f not found: ID does not exist"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.955253 4849 scope.go:117] "RemoveContainer" containerID="7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.955879 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277\": container with ID starting with 7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277 not found: ID does not exist" containerID="7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.955905 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277"} err="failed to get container status \"7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277\": rpc error: code = NotFound desc = could not find container \"7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277\": container with ID starting with 7c4fc7467e98a52bcf798d3ecaca373ce5ce62685169330f04cf4a2b51d18277 not found: ID does not exist"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.955925 4849 scope.go:117] "RemoveContainer" containerID="a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d"
Mar 20 13:45:38 crc kubenswrapper[4849]: E0320 13:45:38.956307 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d\": container with ID starting with a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d not found: ID does not exist" containerID="a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d"
Mar 20 13:45:38 crc kubenswrapper[4849]: I0320 13:45:38.956358 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d"} err="failed to get container status 
\"a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d\": rpc error: code = NotFound desc = could not find container \"a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d\": container with ID starting with a2d29604cf16381c079db5eb5b33cd52b77d07ed48a7b581d2bb85435ef7d89d not found: ID does not exist"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.023769 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.044491 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-config-data\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.044544 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbjzv\" (UniqueName: \"kubernetes.io/projected/1704cf65-fb71-4b3d-aea6-996d785fa6f4-kube-api-access-fbjzv\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.044585 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.044612 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.044658 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-scripts\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.048333 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-log-httpd\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.048388 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-run-httpd\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.048829 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-run-httpd\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.049095 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-log-httpd\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.068672 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.068686 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbjzv\" (UniqueName: \"kubernetes.io/projected/1704cf65-fb71-4b3d-aea6-996d785fa6f4-kube-api-access-fbjzv\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.089774 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-config-data\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.093271 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.095119 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4623c171-dfb8-42e6-9038-a95ed2871b75" path="/var/lib/kubelet/pods/4623c171-dfb8-42e6-9038-a95ed2871b75/volumes"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.096113 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-scripts\") pod \"ceilometer-0\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.096498 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99613c42-8bb0-416a-99ee-22c349eaf052" path="/var/lib/kubelet/pods/99613c42-8bb0-416a-99ee-22c349eaf052/volumes"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.097485 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.144283 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-653c-account-create-update-7p7tb"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.150355 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9llq4\" (UniqueName: \"kubernetes.io/projected/5336158a-f129-45cf-a73c-5e0733002023-kube-api-access-9llq4\") pod \"5336158a-f129-45cf-a73c-5e0733002023\" (UID: \"5336158a-f129-45cf-a73c-5e0733002023\") "
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.150428 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5336158a-f129-45cf-a73c-5e0733002023-operator-scripts\") pod \"5336158a-f129-45cf-a73c-5e0733002023\" (UID: \"5336158a-f129-45cf-a73c-5e0733002023\") "
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.150471 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqvwt\" (UniqueName: \"kubernetes.io/projected/fc762765-75e8-42df-bfd0-86cbad8172b3-kube-api-access-sqvwt\") pod \"fc762765-75e8-42df-bfd0-86cbad8172b3\" (UID: \"fc762765-75e8-42df-bfd0-86cbad8172b3\") "
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.150673 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc762765-75e8-42df-bfd0-86cbad8172b3-operator-scripts\") pod \"fc762765-75e8-42df-bfd0-86cbad8172b3\" (UID: \"fc762765-75e8-42df-bfd0-86cbad8172b3\") "
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.152110 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc762765-75e8-42df-bfd0-86cbad8172b3-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "fc762765-75e8-42df-bfd0-86cbad8172b3" (UID: "fc762765-75e8-42df-bfd0-86cbad8172b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.152924 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5336158a-f129-45cf-a73c-5e0733002023-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5336158a-f129-45cf-a73c-5e0733002023" (UID: "5336158a-f129-45cf-a73c-5e0733002023"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.155887 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.156543 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5336158a-f129-45cf-a73c-5e0733002023-kube-api-access-9llq4" (OuterVolumeSpecName: "kube-api-access-9llq4") pod "5336158a-f129-45cf-a73c-5e0733002023" (UID: "5336158a-f129-45cf-a73c-5e0733002023"). InnerVolumeSpecName "kube-api-access-9llq4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.156625 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc762765-75e8-42df-bfd0-86cbad8172b3-kube-api-access-sqvwt" (OuterVolumeSpecName: "kube-api-access-sqvwt") pod "fc762765-75e8-42df-bfd0-86cbad8172b3" (UID: "fc762765-75e8-42df-bfd0-86cbad8172b3"). InnerVolumeSpecName "kube-api-access-sqvwt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.252743 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d86487-6a56-429a-a4af-afc82ed6a843-operator-scripts\") pod \"b1d86487-6a56-429a-a4af-afc82ed6a843\" (UID: \"b1d86487-6a56-429a-a4af-afc82ed6a843\") "
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.252977 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lhl7\" (UniqueName: \"kubernetes.io/projected/b1d86487-6a56-429a-a4af-afc82ed6a843-kube-api-access-4lhl7\") pod \"b1d86487-6a56-429a-a4af-afc82ed6a843\" (UID: \"b1d86487-6a56-429a-a4af-afc82ed6a843\") "
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.253276 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1d86487-6a56-429a-a4af-afc82ed6a843-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1d86487-6a56-429a-a4af-afc82ed6a843" (UID: "b1d86487-6a56-429a-a4af-afc82ed6a843"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.253598 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9llq4\" (UniqueName: \"kubernetes.io/projected/5336158a-f129-45cf-a73c-5e0733002023-kube-api-access-9llq4\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.253619 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5336158a-f129-45cf-a73c-5e0733002023-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.253661 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqvwt\" (UniqueName: \"kubernetes.io/projected/fc762765-75e8-42df-bfd0-86cbad8172b3-kube-api-access-sqvwt\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.253673 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1d86487-6a56-429a-a4af-afc82ed6a843-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.253684 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc762765-75e8-42df-bfd0-86cbad8172b3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.257600 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d86487-6a56-429a-a4af-afc82ed6a843-kube-api-access-4lhl7" (OuterVolumeSpecName: "kube-api-access-4lhl7") pod "b1d86487-6a56-429a-a4af-afc82ed6a843" (UID: "b1d86487-6a56-429a-a4af-afc82ed6a843"). InnerVolumeSpecName "kube-api-access-4lhl7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.355763 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lhl7\" (UniqueName: \"kubernetes.io/projected/b1d86487-6a56-429a-a4af-afc82ed6a843-kube-api-access-4lhl7\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.536117 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.664668 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerStarted","Data":"c93b41bfd68d476e096f73d432c5134b042bf422686a8107a6860d65beb3dac7"}
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.671465 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-2kmcz" event={"ID":"b1d86487-6a56-429a-a4af-afc82ed6a843","Type":"ContainerDied","Data":"c21d113cbddce24793dc8aaaf796ab9702d0dd46892f50a71c0ed7c7588ddb08"}
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.671497 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-2kmcz"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.671507 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c21d113cbddce24793dc8aaaf796ab9702d0dd46892f50a71c0ed7c7588ddb08"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.673214 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hgx5j" event={"ID":"fc762765-75e8-42df-bfd0-86cbad8172b3","Type":"ContainerDied","Data":"9178da49c7c9ec4cdd1b83c3f75e311a161d57b24896f265f4c47e4ed4800964"}
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.673243 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9178da49c7c9ec4cdd1b83c3f75e311a161d57b24896f265f4c47e4ed4800964"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.673315 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hgx5j"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.681215 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-653c-account-create-update-7p7tb" event={"ID":"5336158a-f129-45cf-a73c-5e0733002023","Type":"ContainerDied","Data":"579b902ec38203e3a861c5a820a3a23cc6347230cb845b27872f07a8869eed64"}
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.681259 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="579b902ec38203e3a861c5a820a3a23cc6347230cb845b27872f07a8869eed64"
Mar 20 13:45:39 crc kubenswrapper[4849]: I0320 13:45:39.681379 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-653c-account-create-update-7p7tb"
Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.077462 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.132887 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.138135 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.277364 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84081e43-a4b1-4462-9b31-21d5d443d016-operator-scripts\") pod \"84081e43-a4b1-4462-9b31-21d5d443d016\" (UID: \"84081e43-a4b1-4462-9b31-21d5d443d016\") " Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.277418 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg89z\" (UniqueName: \"kubernetes.io/projected/d6fe63e8-731d-4c04-9679-25635974e8ce-kube-api-access-jg89z\") pod \"d6fe63e8-731d-4c04-9679-25635974e8ce\" (UID: \"d6fe63e8-731d-4c04-9679-25635974e8ce\") " Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.277487 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2vw4\" (UniqueName: \"kubernetes.io/projected/84081e43-a4b1-4462-9b31-21d5d443d016-kube-api-access-z2vw4\") pod \"84081e43-a4b1-4462-9b31-21d5d443d016\" (UID: \"84081e43-a4b1-4462-9b31-21d5d443d016\") " Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.278270 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fe63e8-731d-4c04-9679-25635974e8ce-operator-scripts\") pod \"d6fe63e8-731d-4c04-9679-25635974e8ce\" (UID: \"d6fe63e8-731d-4c04-9679-25635974e8ce\") " Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.277751 4849 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84081e43-a4b1-4462-9b31-21d5d443d016-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84081e43-a4b1-4462-9b31-21d5d443d016" (UID: "84081e43-a4b1-4462-9b31-21d5d443d016"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.278338 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fafae6e-b99e-4561-8caa-84a392b5e463-operator-scripts\") pod \"5fafae6e-b99e-4561-8caa-84a392b5e463\" (UID: \"5fafae6e-b99e-4561-8caa-84a392b5e463\") " Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.279167 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fafae6e-b99e-4561-8caa-84a392b5e463-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fafae6e-b99e-4561-8caa-84a392b5e463" (UID: "5fafae6e-b99e-4561-8caa-84a392b5e463"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.279252 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6fe63e8-731d-4c04-9679-25635974e8ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6fe63e8-731d-4c04-9679-25635974e8ce" (UID: "d6fe63e8-731d-4c04-9679-25635974e8ce"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.279468 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb98m\" (UniqueName: \"kubernetes.io/projected/5fafae6e-b99e-4561-8caa-84a392b5e463-kube-api-access-gb98m\") pod \"5fafae6e-b99e-4561-8caa-84a392b5e463\" (UID: \"5fafae6e-b99e-4561-8caa-84a392b5e463\") " Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.280294 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6fe63e8-731d-4c04-9679-25635974e8ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.280372 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fafae6e-b99e-4561-8caa-84a392b5e463-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.280438 4849 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84081e43-a4b1-4462-9b31-21d5d443d016-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.287101 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fe63e8-731d-4c04-9679-25635974e8ce-kube-api-access-jg89z" (OuterVolumeSpecName: "kube-api-access-jg89z") pod "d6fe63e8-731d-4c04-9679-25635974e8ce" (UID: "d6fe63e8-731d-4c04-9679-25635974e8ce"). InnerVolumeSpecName "kube-api-access-jg89z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.287214 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fafae6e-b99e-4561-8caa-84a392b5e463-kube-api-access-gb98m" (OuterVolumeSpecName: "kube-api-access-gb98m") pod "5fafae6e-b99e-4561-8caa-84a392b5e463" (UID: "5fafae6e-b99e-4561-8caa-84a392b5e463"). InnerVolumeSpecName "kube-api-access-gb98m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.287797 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84081e43-a4b1-4462-9b31-21d5d443d016-kube-api-access-z2vw4" (OuterVolumeSpecName: "kube-api-access-z2vw4") pod "84081e43-a4b1-4462-9b31-21d5d443d016" (UID: "84081e43-a4b1-4462-9b31-21d5d443d016"). InnerVolumeSpecName "kube-api-access-z2vw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.381704 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb98m\" (UniqueName: \"kubernetes.io/projected/5fafae6e-b99e-4561-8caa-84a392b5e463-kube-api-access-gb98m\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.381734 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg89z\" (UniqueName: \"kubernetes.io/projected/d6fe63e8-731d-4c04-9679-25635974e8ce-kube-api-access-jg89z\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.381745 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2vw4\" (UniqueName: \"kubernetes.io/projected/84081e43-a4b1-4462-9b31-21d5d443d016-kube-api-access-z2vw4\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.693993 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4lmj8" 
event={"ID":"5fafae6e-b99e-4561-8caa-84a392b5e463","Type":"ContainerDied","Data":"e38ae252582e21632d2936628a2cb4c7a9fd76f22b0af33e9f8e3783551d07de"} Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.694327 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e38ae252582e21632d2936628a2cb4c7a9fd76f22b0af33e9f8e3783551d07de" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.694065 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4lmj8" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.696872 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" event={"ID":"84081e43-a4b1-4462-9b31-21d5d443d016","Type":"ContainerDied","Data":"3d1f3917c78d40b6636702495cc124fed1e1b6fc58620eeafb35cdf4c1512f3a"} Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.696920 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d1f3917c78d40b6636702495cc124fed1e1b6fc58620eeafb35cdf4c1512f3a" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.696984 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5bc4-account-create-update-dggjx" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.710512 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c417-account-create-update-x9vgq" event={"ID":"d6fe63e8-731d-4c04-9679-25635974e8ce","Type":"ContainerDied","Data":"2b9c3e7b69790e3b320b4faa37294ed3ad072fa6cfd70b8e2c58c003ecf1140e"} Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.710553 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b9c3e7b69790e3b320b4faa37294ed3ad072fa6cfd70b8e2c58c003ecf1140e" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.710638 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c417-account-create-update-x9vgq" Mar 20 13:45:40 crc kubenswrapper[4849]: I0320 13:45:40.714940 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerStarted","Data":"76af91d639a2dc8d076e7e594201aa4f7d84b653bc180636b7d3a78b9239a578"} Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.533598 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.554614 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bsbt2"] Mar 20 13:45:41 crc kubenswrapper[4849]: E0320 13:45:41.555391 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fafae6e-b99e-4561-8caa-84a392b5e463" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555410 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fafae6e-b99e-4561-8caa-84a392b5e463" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: E0320 13:45:41.555432 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5336158a-f129-45cf-a73c-5e0733002023" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555439 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="5336158a-f129-45cf-a73c-5e0733002023" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc kubenswrapper[4849]: E0320 13:45:41.555463 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fe63e8-731d-4c04-9679-25635974e8ce" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555469 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fe63e8-731d-4c04-9679-25635974e8ce" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc 
kubenswrapper[4849]: E0320 13:45:41.555486 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84081e43-a4b1-4462-9b31-21d5d443d016" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555493 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="84081e43-a4b1-4462-9b31-21d5d443d016" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc kubenswrapper[4849]: E0320 13:45:41.555509 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1d86487-6a56-429a-a4af-afc82ed6a843" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555516 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d86487-6a56-429a-a4af-afc82ed6a843" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: E0320 13:45:41.555542 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc762765-75e8-42df-bfd0-86cbad8172b3" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555548 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc762765-75e8-42df-bfd0-86cbad8172b3" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555894 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fe63e8-731d-4c04-9679-25635974e8ce" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555914 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc762765-75e8-42df-bfd0-86cbad8172b3" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555932 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1d86487-6a56-429a-a4af-afc82ed6a843" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555947 4849 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5fafae6e-b99e-4561-8caa-84a392b5e463" containerName="mariadb-database-create" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555966 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="5336158a-f129-45cf-a73c-5e0733002023" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.555979 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="84081e43-a4b1-4462-9b31-21d5d443d016" containerName="mariadb-account-create-update" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.566879 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.570422 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-25pzz" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.571054 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.571621 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.578730 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bsbt2"] Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.614712 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-scripts\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.614805 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.614883 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mg4\" (UniqueName: \"kubernetes.io/projected/c100d127-fda4-4f86-89d7-64a19be3e8ea-kube-api-access-s4mg4\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.614924 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-config-data\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.718767 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-scripts\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.718866 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.718913 4849 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s4mg4\" (UniqueName: \"kubernetes.io/projected/c100d127-fda4-4f86-89d7-64a19be3e8ea-kube-api-access-s4mg4\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.718952 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-config-data\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.726104 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-scripts\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.728734 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-config-data\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.731603 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.742229 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s4mg4\" (UniqueName: \"kubernetes.io/projected/c100d127-fda4-4f86-89d7-64a19be3e8ea-kube-api-access-s4mg4\") pod \"nova-cell0-conductor-db-sync-bsbt2\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.753357 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerStarted","Data":"3e2c5512c41c187bd116a0b56b8ac9c4b914eb19e6b5098539005f067b781b72"} Mar 20 13:45:41 crc kubenswrapper[4849]: I0320 13:45:41.897652 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:45:42 crc kubenswrapper[4849]: I0320 13:45:42.399324 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bsbt2"] Mar 20 13:45:42 crc kubenswrapper[4849]: W0320 13:45:42.402591 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc100d127_fda4_4f86_89d7_64a19be3e8ea.slice/crio-f811d008376c3b685ae0b4f0c20929eb5fe9d88ac1c8ddec1417fe13e6663d0b WatchSource:0}: Error finding container f811d008376c3b685ae0b4f0c20929eb5fe9d88ac1c8ddec1417fe13e6663d0b: Status 404 returned error can't find the container with id f811d008376c3b685ae0b4f0c20929eb5fe9d88ac1c8ddec1417fe13e6663d0b Mar 20 13:45:42 crc kubenswrapper[4849]: I0320 13:45:42.764438 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerStarted","Data":"0c3e9ce7a2cd14c94b8a9ec977b3fc8c39388b90c6e62cae6f36af099d0e7520"} Mar 20 13:45:42 crc kubenswrapper[4849]: I0320 13:45:42.765751 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" 
event={"ID":"c100d127-fda4-4f86-89d7-64a19be3e8ea","Type":"ContainerStarted","Data":"f811d008376c3b685ae0b4f0c20929eb5fe9d88ac1c8ddec1417fe13e6663d0b"} Mar 20 13:45:42 crc kubenswrapper[4849]: I0320 13:45:42.843782 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:42 crc kubenswrapper[4849]: I0320 13:45:42.941309 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f89c68c76-ng5km" Mar 20 13:45:43 crc kubenswrapper[4849]: I0320 13:45:43.052157 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-58468695c6-sg4wf"] Mar 20 13:45:43 crc kubenswrapper[4849]: I0320 13:45:43.052417 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-58468695c6-sg4wf" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerName="placement-log" containerID="cri-o://53e6f807bee5ada97b5510e01bdf755c584777eef52bc7f8051198ab6aa5e603" gracePeriod=30 Mar 20 13:45:43 crc kubenswrapper[4849]: I0320 13:45:43.053015 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-58468695c6-sg4wf" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerName="placement-api" containerID="cri-o://fdfe8719c7f110e47650840b9f6a310716efb6a3efe80bd4fdb7f954b6cd1cd0" gracePeriod=30 Mar 20 13:45:43 crc kubenswrapper[4849]: I0320 13:45:43.777619 4849 generic.go:334] "Generic (PLEG): container finished" podID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerID="53e6f807bee5ada97b5510e01bdf755c584777eef52bc7f8051198ab6aa5e603" exitCode=143 Mar 20 13:45:43 crc kubenswrapper[4849]: I0320 13:45:43.777716 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58468695c6-sg4wf" event={"ID":"f33c702a-869d-44ae-ab1c-a52d1bb71740","Type":"ContainerDied","Data":"53e6f807bee5ada97b5510e01bdf755c584777eef52bc7f8051198ab6aa5e603"} Mar 20 13:45:44 crc 
kubenswrapper[4849]: I0320 13:45:44.792715 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerStarted","Data":"27dd3d104ae54ee7e30b7a233a3a359867edfd5a27fd79e675685c611fe512ab"} Mar 20 13:45:44 crc kubenswrapper[4849]: I0320 13:45:44.792878 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="ceilometer-central-agent" containerID="cri-o://76af91d639a2dc8d076e7e594201aa4f7d84b653bc180636b7d3a78b9239a578" gracePeriod=30 Mar 20 13:45:44 crc kubenswrapper[4849]: I0320 13:45:44.792992 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="proxy-httpd" containerID="cri-o://27dd3d104ae54ee7e30b7a233a3a359867edfd5a27fd79e675685c611fe512ab" gracePeriod=30 Mar 20 13:45:44 crc kubenswrapper[4849]: I0320 13:45:44.793069 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="ceilometer-notification-agent" containerID="cri-o://3e2c5512c41c187bd116a0b56b8ac9c4b914eb19e6b5098539005f067b781b72" gracePeriod=30 Mar 20 13:45:44 crc kubenswrapper[4849]: I0320 13:45:44.793089 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:45:44 crc kubenswrapper[4849]: I0320 13:45:44.793090 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="sg-core" containerID="cri-o://0c3e9ce7a2cd14c94b8a9ec977b3fc8c39388b90c6e62cae6f36af099d0e7520" gracePeriod=30 Mar 20 13:45:44 crc kubenswrapper[4849]: I0320 13:45:44.826289 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.738172226 podStartE2EDuration="6.826266095s" podCreationTimestamp="2026-03-20 13:45:38 +0000 UTC" firstStartedPulling="2026-03-20 13:45:39.557738284 +0000 UTC m=+1289.235461679" lastFinishedPulling="2026-03-20 13:45:43.645832153 +0000 UTC m=+1293.323555548" observedRunningTime="2026-03-20 13:45:44.814323539 +0000 UTC m=+1294.492046944" watchObservedRunningTime="2026-03-20 13:45:44.826266095 +0000 UTC m=+1294.503989480" Mar 20 13:45:45 crc kubenswrapper[4849]: I0320 13:45:45.805033 4849 generic.go:334] "Generic (PLEG): container finished" podID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerID="27dd3d104ae54ee7e30b7a233a3a359867edfd5a27fd79e675685c611fe512ab" exitCode=0 Mar 20 13:45:45 crc kubenswrapper[4849]: I0320 13:45:45.805261 4849 generic.go:334] "Generic (PLEG): container finished" podID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerID="0c3e9ce7a2cd14c94b8a9ec977b3fc8c39388b90c6e62cae6f36af099d0e7520" exitCode=2 Mar 20 13:45:45 crc kubenswrapper[4849]: I0320 13:45:45.805272 4849 generic.go:334] "Generic (PLEG): container finished" podID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerID="3e2c5512c41c187bd116a0b56b8ac9c4b914eb19e6b5098539005f067b781b72" exitCode=0 Mar 20 13:45:45 crc kubenswrapper[4849]: I0320 13:45:45.805114 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerDied","Data":"27dd3d104ae54ee7e30b7a233a3a359867edfd5a27fd79e675685c611fe512ab"} Mar 20 13:45:45 crc kubenswrapper[4849]: I0320 13:45:45.805311 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerDied","Data":"0c3e9ce7a2cd14c94b8a9ec977b3fc8c39388b90c6e62cae6f36af099d0e7520"} Mar 20 13:45:45 crc kubenswrapper[4849]: I0320 13:45:45.805327 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerDied","Data":"3e2c5512c41c187bd116a0b56b8ac9c4b914eb19e6b5098539005f067b781b72"} Mar 20 13:45:46 crc kubenswrapper[4849]: I0320 13:45:46.816457 4849 generic.go:334] "Generic (PLEG): container finished" podID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerID="fdfe8719c7f110e47650840b9f6a310716efb6a3efe80bd4fdb7f954b6cd1cd0" exitCode=0 Mar 20 13:45:46 crc kubenswrapper[4849]: I0320 13:45:46.816522 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58468695c6-sg4wf" event={"ID":"f33c702a-869d-44ae-ab1c-a52d1bb71740","Type":"ContainerDied","Data":"fdfe8719c7f110e47650840b9f6a310716efb6a3efe80bd4fdb7f954b6cd1cd0"} Mar 20 13:45:48 crc kubenswrapper[4849]: I0320 13:45:48.222405 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:48 crc kubenswrapper[4849]: I0320 13:45:48.225572 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerName="glance-httpd" containerID="cri-o://358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f" gracePeriod=30 Mar 20 13:45:48 crc kubenswrapper[4849]: I0320 13:45:48.225920 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerName="glance-log" containerID="cri-o://ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65" gracePeriod=30 Mar 20 13:45:48 crc kubenswrapper[4849]: I0320 13:45:48.835983 4849 generic.go:334] "Generic (PLEG): container finished" podID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerID="ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65" exitCode=143 Mar 20 13:45:48 crc kubenswrapper[4849]: I0320 13:45:48.836552 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0","Type":"ContainerDied","Data":"ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65"} Mar 20 13:45:48 crc kubenswrapper[4849]: I0320 13:45:48.839252 4849 generic.go:334] "Generic (PLEG): container finished" podID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerID="76af91d639a2dc8d076e7e594201aa4f7d84b653bc180636b7d3a78b9239a578" exitCode=0 Mar 20 13:45:48 crc kubenswrapper[4849]: I0320 13:45:48.839278 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerDied","Data":"76af91d639a2dc8d076e7e594201aa4f7d84b653bc180636b7d3a78b9239a578"} Mar 20 13:45:49 crc kubenswrapper[4849]: I0320 13:45:49.585189 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:49 crc kubenswrapper[4849]: I0320 13:45:49.585478 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cba85559-136f-44e8-abc0-569ff409b64f" containerName="glance-log" containerID="cri-o://83d687f09c5f7f52eee6b78d59bb544c7b81984bbcadd67b79ad76c79bf77203" gracePeriod=30 Mar 20 13:45:49 crc kubenswrapper[4849]: I0320 13:45:49.585553 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cba85559-136f-44e8-abc0-569ff409b64f" containerName="glance-httpd" containerID="cri-o://5404262e6156bb6146e3f6d868367e7d68faec96efa2497a9920e6192b4a8e47" gracePeriod=30 Mar 20 13:45:49 crc kubenswrapper[4849]: I0320 13:45:49.875429 4849 generic.go:334] "Generic (PLEG): container finished" podID="cba85559-136f-44e8-abc0-569ff409b64f" containerID="83d687f09c5f7f52eee6b78d59bb544c7b81984bbcadd67b79ad76c79bf77203" exitCode=143 Mar 20 13:45:49 crc kubenswrapper[4849]: I0320 13:45:49.875647 4849 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba85559-136f-44e8-abc0-569ff409b64f","Type":"ContainerDied","Data":"83d687f09c5f7f52eee6b78d59bb544c7b81984bbcadd67b79ad76c79bf77203"} Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.302757 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.386425 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.429423 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-internal-tls-certs\") pod \"f33c702a-869d-44ae-ab1c-a52d1bb71740\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.429466 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-public-tls-certs\") pod \"f33c702a-869d-44ae-ab1c-a52d1bb71740\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.429492 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpm8b\" (UniqueName: \"kubernetes.io/projected/f33c702a-869d-44ae-ab1c-a52d1bb71740-kube-api-access-wpm8b\") pod \"f33c702a-869d-44ae-ab1c-a52d1bb71740\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.429547 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-config-data\") pod \"f33c702a-869d-44ae-ab1c-a52d1bb71740\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " Mar 20 
13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.429567 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-combined-ca-bundle\") pod \"f33c702a-869d-44ae-ab1c-a52d1bb71740\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.429682 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-scripts\") pod \"f33c702a-869d-44ae-ab1c-a52d1bb71740\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.429736 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33c702a-869d-44ae-ab1c-a52d1bb71740-logs\") pod \"f33c702a-869d-44ae-ab1c-a52d1bb71740\" (UID: \"f33c702a-869d-44ae-ab1c-a52d1bb71740\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.432861 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33c702a-869d-44ae-ab1c-a52d1bb71740-logs" (OuterVolumeSpecName: "logs") pod "f33c702a-869d-44ae-ab1c-a52d1bb71740" (UID: "f33c702a-869d-44ae-ab1c-a52d1bb71740"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.437002 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-scripts" (OuterVolumeSpecName: "scripts") pod "f33c702a-869d-44ae-ab1c-a52d1bb71740" (UID: "f33c702a-869d-44ae-ab1c-a52d1bb71740"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.437186 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33c702a-869d-44ae-ab1c-a52d1bb71740-kube-api-access-wpm8b" (OuterVolumeSpecName: "kube-api-access-wpm8b") pod "f33c702a-869d-44ae-ab1c-a52d1bb71740" (UID: "f33c702a-869d-44ae-ab1c-a52d1bb71740"). InnerVolumeSpecName "kube-api-access-wpm8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.486984 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f33c702a-869d-44ae-ab1c-a52d1bb71740" (UID: "f33c702a-869d-44ae-ab1c-a52d1bb71740"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.491657 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-config-data" (OuterVolumeSpecName: "config-data") pod "f33c702a-869d-44ae-ab1c-a52d1bb71740" (UID: "f33c702a-869d-44ae-ab1c-a52d1bb71740"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.534195 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-run-httpd\") pod \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.534272 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbjzv\" (UniqueName: \"kubernetes.io/projected/1704cf65-fb71-4b3d-aea6-996d785fa6f4-kube-api-access-fbjzv\") pod \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.534435 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-log-httpd\") pod \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.534509 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-scripts\") pod \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.534647 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-sg-core-conf-yaml\") pod \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.534709 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-combined-ca-bundle\") pod \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.534769 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-config-data\") pod \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\" (UID: \"1704cf65-fb71-4b3d-aea6-996d785fa6f4\") " Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.535087 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1704cf65-fb71-4b3d-aea6-996d785fa6f4" (UID: "1704cf65-fb71-4b3d-aea6-996d785fa6f4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.535534 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpm8b\" (UniqueName: \"kubernetes.io/projected/f33c702a-869d-44ae-ab1c-a52d1bb71740-kube-api-access-wpm8b\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.535592 4849 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.535609 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.535626 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.535641 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.535712 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f33c702a-869d-44ae-ab1c-a52d1bb71740-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.536009 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f33c702a-869d-44ae-ab1c-a52d1bb71740" (UID: "f33c702a-869d-44ae-ab1c-a52d1bb71740"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.536014 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1704cf65-fb71-4b3d-aea6-996d785fa6f4" (UID: "1704cf65-fb71-4b3d-aea6-996d785fa6f4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.538714 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-scripts" (OuterVolumeSpecName: "scripts") pod "1704cf65-fb71-4b3d-aea6-996d785fa6f4" (UID: "1704cf65-fb71-4b3d-aea6-996d785fa6f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.539384 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f33c702a-869d-44ae-ab1c-a52d1bb71740" (UID: "f33c702a-869d-44ae-ab1c-a52d1bb71740"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.540762 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1704cf65-fb71-4b3d-aea6-996d785fa6f4-kube-api-access-fbjzv" (OuterVolumeSpecName: "kube-api-access-fbjzv") pod "1704cf65-fb71-4b3d-aea6-996d785fa6f4" (UID: "1704cf65-fb71-4b3d-aea6-996d785fa6f4"). InnerVolumeSpecName "kube-api-access-fbjzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.571495 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1704cf65-fb71-4b3d-aea6-996d785fa6f4" (UID: "1704cf65-fb71-4b3d-aea6-996d785fa6f4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.615018 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1704cf65-fb71-4b3d-aea6-996d785fa6f4" (UID: "1704cf65-fb71-4b3d-aea6-996d785fa6f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.634975 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-config-data" (OuterVolumeSpecName: "config-data") pod "1704cf65-fb71-4b3d-aea6-996d785fa6f4" (UID: "1704cf65-fb71-4b3d-aea6-996d785fa6f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.637733 4849 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.638296 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.638424 4849 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.638571 4849 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f33c702a-869d-44ae-ab1c-a52d1bb71740-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.638725 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.638832 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbjzv\" (UniqueName: 
\"kubernetes.io/projected/1704cf65-fb71-4b3d-aea6-996d785fa6f4-kube-api-access-fbjzv\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.638924 4849 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1704cf65-fb71-4b3d-aea6-996d785fa6f4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.638992 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1704cf65-fb71-4b3d-aea6-996d785fa6f4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.893960 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58468695c6-sg4wf" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.893965 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58468695c6-sg4wf" event={"ID":"f33c702a-869d-44ae-ab1c-a52d1bb71740","Type":"ContainerDied","Data":"eae167f063b1c375d544fd3efd75f9c0bb0b42e752142c6228502d68f51cfaf2"} Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.894393 4849 scope.go:117] "RemoveContainer" containerID="fdfe8719c7f110e47650840b9f6a310716efb6a3efe80bd4fdb7f954b6cd1cd0" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.900361 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1704cf65-fb71-4b3d-aea6-996d785fa6f4","Type":"ContainerDied","Data":"c93b41bfd68d476e096f73d432c5134b042bf422686a8107a6860d65beb3dac7"} Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.900439 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.903469 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" event={"ID":"c100d127-fda4-4f86-89d7-64a19be3e8ea","Type":"ContainerStarted","Data":"7ed75db7a040c41a3c2f0b158b085a98f26a5f20c673bec43eada26c5e893852"} Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.930933 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" podStartSLOduration=2.308348207 podStartE2EDuration="9.930916075s" podCreationTimestamp="2026-03-20 13:45:41 +0000 UTC" firstStartedPulling="2026-03-20 13:45:42.405023831 +0000 UTC m=+1292.082747226" lastFinishedPulling="2026-03-20 13:45:50.027591699 +0000 UTC m=+1299.705315094" observedRunningTime="2026-03-20 13:45:50.929521399 +0000 UTC m=+1300.607244794" watchObservedRunningTime="2026-03-20 13:45:50.930916075 +0000 UTC m=+1300.608639470" Mar 20 13:45:50 crc kubenswrapper[4849]: I0320 13:45:50.998059 4849 scope.go:117] "RemoveContainer" containerID="53e6f807bee5ada97b5510e01bdf755c584777eef52bc7f8051198ab6aa5e603" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.002195 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.016134 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.029974 4849 scope.go:117] "RemoveContainer" containerID="27dd3d104ae54ee7e30b7a233a3a359867edfd5a27fd79e675685c611fe512ab" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.030736 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-58468695c6-sg4wf"] Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.057991 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" 
path="/var/lib/kubelet/pods/1704cf65-fb71-4b3d-aea6-996d785fa6f4/volumes" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.058649 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-58468695c6-sg4wf"] Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.058679 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:51 crc kubenswrapper[4849]: E0320 13:45:51.058970 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="ceilometer-notification-agent" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.058989 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="ceilometer-notification-agent" Mar 20 13:45:51 crc kubenswrapper[4849]: E0320 13:45:51.058999 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="proxy-httpd" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059005 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="proxy-httpd" Mar 20 13:45:51 crc kubenswrapper[4849]: E0320 13:45:51.059015 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="ceilometer-central-agent" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059021 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="ceilometer-central-agent" Mar 20 13:45:51 crc kubenswrapper[4849]: E0320 13:45:51.059033 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="sg-core" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059038 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="sg-core" Mar 20 
13:45:51 crc kubenswrapper[4849]: E0320 13:45:51.059047 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerName="placement-log" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059052 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerName="placement-log" Mar 20 13:45:51 crc kubenswrapper[4849]: E0320 13:45:51.059063 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerName="placement-api" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059069 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerName="placement-api" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059232 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="proxy-httpd" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059245 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="sg-core" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059256 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" containerName="ceilometer-notification-agent" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059267 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerName="placement-log" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059278 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" containerName="placement-api" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059289 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="1704cf65-fb71-4b3d-aea6-996d785fa6f4" 
containerName="ceilometer-central-agent" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.059908 4849 scope.go:117] "RemoveContainer" containerID="0c3e9ce7a2cd14c94b8a9ec977b3fc8c39388b90c6e62cae6f36af099d0e7520" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.061251 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.061334 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.063925 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.064143 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.085870 4849 scope.go:117] "RemoveContainer" containerID="3e2c5512c41c187bd116a0b56b8ac9c4b914eb19e6b5098539005f067b781b72" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.111244 4849 scope.go:117] "RemoveContainer" containerID="76af91d639a2dc8d076e7e594201aa4f7d84b653bc180636b7d3a78b9239a578" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.253532 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-run-httpd\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.253578 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-config-data\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: 
I0320 13:45:51.253618 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf68\" (UniqueName: \"kubernetes.io/projected/f447de3d-38fe-406a-a816-773b3779497a-kube-api-access-kcf68\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.253640 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-scripts\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.253731 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.253789 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-log-httpd\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.253809 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.354979 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-run-httpd\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.355023 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-config-data\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.355078 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf68\" (UniqueName: \"kubernetes.io/projected/f447de3d-38fe-406a-a816-773b3779497a-kube-api-access-kcf68\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.355109 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-scripts\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.355136 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.355196 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-log-httpd\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: 
I0320 13:45:51.355220 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.355536 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-run-httpd\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.357369 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-log-httpd\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.359292 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.360359 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.361112 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-config-data\") pod \"ceilometer-0\" (UID: 
\"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.362308 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-scripts\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.381487 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf68\" (UniqueName: \"kubernetes.io/projected/f447de3d-38fe-406a-a816-773b3779497a-kube-api-access-kcf68\") pod \"ceilometer-0\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.385321 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.731685 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.777615 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-public-tls-certs\") pod \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.777663 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-scripts\") pod \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.777721 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.777750 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-combined-ca-bundle\") pod \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.777912 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-httpd-run\") pod \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.786562 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" (UID: "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.788361 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" (UID: "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.788385 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-scripts" (OuterVolumeSpecName: "scripts") pod "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" (UID: "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.843940 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" (UID: "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.844257 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" (UID: "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.879524 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhgk9\" (UniqueName: \"kubernetes.io/projected/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-kube-api-access-dhgk9\") pod \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.880121 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-config-data\") pod \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.880250 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-logs\") pod \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\" (UID: \"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0\") " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.880811 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.880936 4849 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.881008 4849 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.881090 4849 reconciler_common.go:293] "Volume detached for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.881160 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.886345 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-kube-api-access-dhgk9" (OuterVolumeSpecName: "kube-api-access-dhgk9") pod "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" (UID: "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0"). InnerVolumeSpecName "kube-api-access-dhgk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.886570 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-logs" (OuterVolumeSpecName: "logs") pod "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" (UID: "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.937289 4849 generic.go:334] "Generic (PLEG): container finished" podID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerID="358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f" exitCode=0 Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.937354 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0","Type":"ContainerDied","Data":"358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f"} Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.937381 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0","Type":"ContainerDied","Data":"ef5820865243e2fcf2f6f35e7acb8c711ebba03fe0607c843dd4d9d257a16e1b"} Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.937398 4849 scope.go:117] "RemoveContainer" containerID="358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.937526 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.969988 4849 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.983163 4849 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.983207 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhgk9\" (UniqueName: \"kubernetes.io/projected/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-kube-api-access-dhgk9\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:51 crc kubenswrapper[4849]: I0320 13:45:51.983222 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.000544 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-config-data" (OuterVolumeSpecName: "config-data") pod "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" (UID: "3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.009717 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.064649 4849 scope.go:117] "RemoveContainer" containerID="ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.083352 4849 scope.go:117] "RemoveContainer" containerID="358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.085080 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:52 crc kubenswrapper[4849]: E0320 13:45:52.085574 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f\": container with ID starting with 358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f not found: ID does not exist" containerID="358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.085617 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f"} err="failed to get container status \"358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f\": rpc error: code = NotFound desc = could not find container \"358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f\": container with ID starting with 358ce0ecbee13b3aad59aca610e9fb41c4914e4ca0848e139b72eb3a5bfa981f not found: ID does not exist" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.085644 4849 scope.go:117] "RemoveContainer" 
containerID="ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65" Mar 20 13:45:52 crc kubenswrapper[4849]: E0320 13:45:52.086043 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65\": container with ID starting with ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65 not found: ID does not exist" containerID="ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.086085 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65"} err="failed to get container status \"ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65\": rpc error: code = NotFound desc = could not find container \"ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65\": container with ID starting with ec92408006e9db1b14c37ee4046fc47dcc1834759461894cb06f16d651720f65 not found: ID does not exist" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.269525 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.276409 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.298934 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:52 crc kubenswrapper[4849]: E0320 13:45:52.299398 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerName="glance-httpd" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.299419 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" 
containerName="glance-httpd" Mar 20 13:45:52 crc kubenswrapper[4849]: E0320 13:45:52.299436 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerName="glance-log" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.299444 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerName="glance-log" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.299931 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerName="glance-log" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.299975 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" containerName="glance-httpd" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.301366 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.303558 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.306956 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.332934 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.490649 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwtvp\" (UniqueName: \"kubernetes.io/projected/0245248e-3173-455f-9610-41be03d97ab1-kube-api-access-qwtvp\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 
13:45:52.490720 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.490740 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.490757 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.490789 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-scripts\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.490807 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-config-data\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: 
I0320 13:45:52.490852 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0245248e-3173-455f-9610-41be03d97ab1-logs\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.490877 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0245248e-3173-455f-9610-41be03d97ab1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.592223 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-config-data\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.592608 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0245248e-3173-455f-9610-41be03d97ab1-logs\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.592655 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0245248e-3173-455f-9610-41be03d97ab1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.592779 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qwtvp\" (UniqueName: \"kubernetes.io/projected/0245248e-3173-455f-9610-41be03d97ab1-kube-api-access-qwtvp\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.592864 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.592894 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.592913 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.592947 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-scripts\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.593100 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0245248e-3173-455f-9610-41be03d97ab1-logs\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.593452 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.594089 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0245248e-3173-455f-9610-41be03d97ab1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.599429 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-config-data\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.599835 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.601937 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.602908 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0245248e-3173-455f-9610-41be03d97ab1-scripts\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.619570 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwtvp\" (UniqueName: \"kubernetes.io/projected/0245248e-3173-455f-9610-41be03d97ab1-kube-api-access-qwtvp\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.629222 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-external-api-0\" (UID: \"0245248e-3173-455f-9610-41be03d97ab1\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.917167 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.981899 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerStarted","Data":"3190e192417242dccdf5cbb501847bc5e3a7c192d4fe9402165862b7c270340f"} Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.982206 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerStarted","Data":"3e7bcfe0811623bd37688090b8d71586400f2d2cb3c63c225bb74db684b06f29"} Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.994347 4849 generic.go:334] "Generic (PLEG): container finished" podID="cba85559-136f-44e8-abc0-569ff409b64f" containerID="5404262e6156bb6146e3f6d868367e7d68faec96efa2497a9920e6192b4a8e47" exitCode=0 Mar 20 13:45:52 crc kubenswrapper[4849]: I0320 13:45:52.994465 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba85559-136f-44e8-abc0-569ff409b64f","Type":"ContainerDied","Data":"5404262e6156bb6146e3f6d868367e7d68faec96efa2497a9920e6192b4a8e47"} Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.053136 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0" path="/var/lib/kubelet/pods/3fb2e9c5-af86-4c9c-8fcf-80a3bb92fda0/volumes" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.054465 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33c702a-869d-44ae-ab1c-a52d1bb71740" path="/var/lib/kubelet/pods/f33c702a-869d-44ae-ab1c-a52d1bb71740/volumes" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.324789 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.510972 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"cba85559-136f-44e8-abc0-569ff409b64f\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511019 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r5xl\" (UniqueName: \"kubernetes.io/projected/cba85559-136f-44e8-abc0-569ff409b64f-kube-api-access-4r5xl\") pod \"cba85559-136f-44e8-abc0-569ff409b64f\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511061 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-scripts\") pod \"cba85559-136f-44e8-abc0-569ff409b64f\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511121 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-internal-tls-certs\") pod \"cba85559-136f-44e8-abc0-569ff409b64f\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511240 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-logs\") pod \"cba85559-136f-44e8-abc0-569ff409b64f\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511263 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-httpd-run\") pod \"cba85559-136f-44e8-abc0-569ff409b64f\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511287 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-config-data\") pod \"cba85559-136f-44e8-abc0-569ff409b64f\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511350 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-combined-ca-bundle\") pod \"cba85559-136f-44e8-abc0-569ff409b64f\" (UID: \"cba85559-136f-44e8-abc0-569ff409b64f\") " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511652 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-logs" (OuterVolumeSpecName: "logs") pod "cba85559-136f-44e8-abc0-569ff409b64f" (UID: "cba85559-136f-44e8-abc0-569ff409b64f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.511940 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.512128 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cba85559-136f-44e8-abc0-569ff409b64f" (UID: "cba85559-136f-44e8-abc0-569ff409b64f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.516943 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "cba85559-136f-44e8-abc0-569ff409b64f" (UID: "cba85559-136f-44e8-abc0-569ff409b64f"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.516992 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-scripts" (OuterVolumeSpecName: "scripts") pod "cba85559-136f-44e8-abc0-569ff409b64f" (UID: "cba85559-136f-44e8-abc0-569ff409b64f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.516999 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba85559-136f-44e8-abc0-569ff409b64f-kube-api-access-4r5xl" (OuterVolumeSpecName: "kube-api-access-4r5xl") pod "cba85559-136f-44e8-abc0-569ff409b64f" (UID: "cba85559-136f-44e8-abc0-569ff409b64f"). InnerVolumeSpecName "kube-api-access-4r5xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.540428 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cba85559-136f-44e8-abc0-569ff409b64f" (UID: "cba85559-136f-44e8-abc0-569ff409b64f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.541476 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:53 crc kubenswrapper[4849]: W0320 13:45:53.551244 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0245248e_3173_455f_9610_41be03d97ab1.slice/crio-f1567dee65af88f4077768b33183dfee3a0c23431506f0c9b1157fd0c6754d10 WatchSource:0}: Error finding container f1567dee65af88f4077768b33183dfee3a0c23431506f0c9b1157fd0c6754d10: Status 404 returned error can't find the container with id f1567dee65af88f4077768b33183dfee3a0c23431506f0c9b1157fd0c6754d10 Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.569189 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-config-data" (OuterVolumeSpecName: "config-data") pod "cba85559-136f-44e8-abc0-569ff409b64f" (UID: "cba85559-136f-44e8-abc0-569ff409b64f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.580453 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cba85559-136f-44e8-abc0-569ff409b64f" (UID: "cba85559-136f-44e8-abc0-569ff409b64f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.614094 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.614170 4849 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.614187 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r5xl\" (UniqueName: \"kubernetes.io/projected/cba85559-136f-44e8-abc0-569ff409b64f-kube-api-access-4r5xl\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.614203 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.614216 4849 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.614232 4849 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba85559-136f-44e8-abc0-569ff409b64f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.614243 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba85559-136f-44e8-abc0-569ff409b64f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.636255 4849 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 13:45:53 crc kubenswrapper[4849]: I0320 13:45:53.716072 4849 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.018490 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cba85559-136f-44e8-abc0-569ff409b64f","Type":"ContainerDied","Data":"93830b1da5108fd4904dc53843c848128ccf7fb55b9ef327dfe1bf0992864a17"} Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.018543 4849 scope.go:117] "RemoveContainer" containerID="5404262e6156bb6146e3f6d868367e7d68faec96efa2497a9920e6192b4a8e47" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.018646 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.023776 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0245248e-3173-455f-9610-41be03d97ab1","Type":"ContainerStarted","Data":"f1567dee65af88f4077768b33183dfee3a0c23431506f0c9b1157fd0c6754d10"} Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.026261 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerStarted","Data":"0411e9b1881e2a70ce08239972fe2503e2f76ee6c10f2e589607cd375f7d16db"} Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.057700 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.079082 4849 scope.go:117] "RemoveContainer" 
containerID="83d687f09c5f7f52eee6b78d59bb544c7b81984bbcadd67b79ad76c79bf77203" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.136891 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.165148 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:54 crc kubenswrapper[4849]: E0320 13:45:54.165596 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba85559-136f-44e8-abc0-569ff409b64f" containerName="glance-log" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.165612 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba85559-136f-44e8-abc0-569ff409b64f" containerName="glance-log" Mar 20 13:45:54 crc kubenswrapper[4849]: E0320 13:45:54.165639 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba85559-136f-44e8-abc0-569ff409b64f" containerName="glance-httpd" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.165646 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba85559-136f-44e8-abc0-569ff409b64f" containerName="glance-httpd" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.165809 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba85559-136f-44e8-abc0-569ff409b64f" containerName="glance-httpd" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.165849 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba85559-136f-44e8-abc0-569ff409b64f" containerName="glance-log" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.166856 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.173545 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.173865 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.178420 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.238990 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.239346 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.239369 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.239398 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.239433 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.239482 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.239517 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.239559 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zv2w\" (UniqueName: \"kubernetes.io/projected/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-kube-api-access-8zv2w\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.341716 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8zv2w\" (UniqueName: \"kubernetes.io/projected/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-kube-api-access-8zv2w\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.341796 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.341888 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.341908 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.341930 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.341968 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.342006 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.342035 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.343243 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.343385 4849 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.343762 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.349320 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.351111 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.356028 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.362494 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.363127 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zv2w\" (UniqueName: \"kubernetes.io/projected/9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99-kube-api-access-8zv2w\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.391989 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:54 crc kubenswrapper[4849]: I0320 13:45:54.623185 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:55 crc kubenswrapper[4849]: I0320 13:45:55.046991 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba85559-136f-44e8-abc0-569ff409b64f" path="/var/lib/kubelet/pods/cba85559-136f-44e8-abc0-569ff409b64f/volumes" Mar 20 13:45:55 crc kubenswrapper[4849]: I0320 13:45:55.048175 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0245248e-3173-455f-9610-41be03d97ab1","Type":"ContainerStarted","Data":"46818a5dd692b0f832dfe6c8b991c3d08e86ddea8e7885ba7b69bbf2d1031855"} Mar 20 13:45:55 crc kubenswrapper[4849]: I0320 13:45:55.048209 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0245248e-3173-455f-9610-41be03d97ab1","Type":"ContainerStarted","Data":"0b2f8ba99c7eff71f66496163f56a7c1460cb3d134ce251a37ea595e25379fdd"} Mar 20 13:45:55 crc kubenswrapper[4849]: I0320 13:45:55.048224 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerStarted","Data":"54e8fa3a413c75c90d57e5ddcc8f6506329abdcd4dc1597bf38927b0149676d2"} Mar 20 13:45:55 crc kubenswrapper[4849]: I0320 13:45:55.060138 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.060118797 
podStartE2EDuration="3.060118797s" podCreationTimestamp="2026-03-20 13:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:55.056749089 +0000 UTC m=+1304.734472514" watchObservedRunningTime="2026-03-20 13:45:55.060118797 +0000 UTC m=+1304.737842192" Mar 20 13:45:55 crc kubenswrapper[4849]: I0320 13:45:55.195760 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:56 crc kubenswrapper[4849]: I0320 13:45:56.058940 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99","Type":"ContainerStarted","Data":"251fcb969cb6dacadebefe52333bcfcb584af29292bbd64a1e1031ff656ead94"} Mar 20 13:45:56 crc kubenswrapper[4849]: I0320 13:45:56.059265 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99","Type":"ContainerStarted","Data":"ffc08ba4962f96afbe296743b463b8a3b28e9c3540f1bafdeeccb111d3a7ecab"} Mar 20 13:45:57 crc kubenswrapper[4849]: I0320 13:45:57.070413 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerStarted","Data":"18ae6391048ec982f1e8063c5332f056357f2998d1e82fe55f9139ea3fff2d94"} Mar 20 13:45:57 crc kubenswrapper[4849]: I0320 13:45:57.071779 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:45:57 crc kubenswrapper[4849]: I0320 13:45:57.073622 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99","Type":"ContainerStarted","Data":"ab4395239c91ddff5f89b00e24a30f8d1e992faba810a0a482e3df8f5273d547"} Mar 20 13:45:57 crc kubenswrapper[4849]: I0320 13:45:57.100096 4849 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.584462695 podStartE2EDuration="7.100076026s" podCreationTimestamp="2026-03-20 13:45:50 +0000 UTC" firstStartedPulling="2026-03-20 13:45:52.00199686 +0000 UTC m=+1301.679720245" lastFinishedPulling="2026-03-20 13:45:56.517610191 +0000 UTC m=+1306.195333576" observedRunningTime="2026-03-20 13:45:57.095408275 +0000 UTC m=+1306.773131670" watchObservedRunningTime="2026-03-20 13:45:57.100076026 +0000 UTC m=+1306.777799421" Mar 20 13:45:57 crc kubenswrapper[4849]: I0320 13:45:57.118717 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.118701072 podStartE2EDuration="3.118701072s" podCreationTimestamp="2026-03-20 13:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:57.115842537 +0000 UTC m=+1306.793565932" watchObservedRunningTime="2026-03-20 13:45:57.118701072 +0000 UTC m=+1306.796424467" Mar 20 13:45:57 crc kubenswrapper[4849]: I0320 13:45:57.812901 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:59 crc kubenswrapper[4849]: I0320 13:45:59.090492 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="sg-core" containerID="cri-o://54e8fa3a413c75c90d57e5ddcc8f6506329abdcd4dc1597bf38927b0149676d2" gracePeriod=30 Mar 20 13:45:59 crc kubenswrapper[4849]: I0320 13:45:59.090476 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="ceilometer-central-agent" containerID="cri-o://3190e192417242dccdf5cbb501847bc5e3a7c192d4fe9402165862b7c270340f" gracePeriod=30 Mar 20 13:45:59 crc kubenswrapper[4849]: 
I0320 13:45:59.090554 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="proxy-httpd" containerID="cri-o://18ae6391048ec982f1e8063c5332f056357f2998d1e82fe55f9139ea3fff2d94" gracePeriod=30 Mar 20 13:45:59 crc kubenswrapper[4849]: I0320 13:45:59.090583 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="ceilometer-notification-agent" containerID="cri-o://0411e9b1881e2a70ce08239972fe2503e2f76ee6c10f2e589607cd375f7d16db" gracePeriod=30 Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.099967 4849 generic.go:334] "Generic (PLEG): container finished" podID="f447de3d-38fe-406a-a816-773b3779497a" containerID="18ae6391048ec982f1e8063c5332f056357f2998d1e82fe55f9139ea3fff2d94" exitCode=0 Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.100438 4849 generic.go:334] "Generic (PLEG): container finished" podID="f447de3d-38fe-406a-a816-773b3779497a" containerID="54e8fa3a413c75c90d57e5ddcc8f6506329abdcd4dc1597bf38927b0149676d2" exitCode=2 Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.100448 4849 generic.go:334] "Generic (PLEG): container finished" podID="f447de3d-38fe-406a-a816-773b3779497a" containerID="0411e9b1881e2a70ce08239972fe2503e2f76ee6c10f2e589607cd375f7d16db" exitCode=0 Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.100456 4849 generic.go:334] "Generic (PLEG): container finished" podID="f447de3d-38fe-406a-a816-773b3779497a" containerID="3190e192417242dccdf5cbb501847bc5e3a7c192d4fe9402165862b7c270340f" exitCode=0 Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.100045 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerDied","Data":"18ae6391048ec982f1e8063c5332f056357f2998d1e82fe55f9139ea3fff2d94"} Mar 20 13:46:00 
crc kubenswrapper[4849]: I0320 13:46:00.100483 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerDied","Data":"54e8fa3a413c75c90d57e5ddcc8f6506329abdcd4dc1597bf38927b0149676d2"} Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.100492 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerDied","Data":"0411e9b1881e2a70ce08239972fe2503e2f76ee6c10f2e589607cd375f7d16db"} Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.100501 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerDied","Data":"3190e192417242dccdf5cbb501847bc5e3a7c192d4fe9402165862b7c270340f"} Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.140894 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566906-4pc5f"] Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.142281 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-4pc5f" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.145154 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.145425 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.147310 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.148767 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-4pc5f"] Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.208596 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.247699 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln7tj\" (UniqueName: \"kubernetes.io/projected/91dea7c8-c5a7-4c12-8e7b-8477fededeb5-kube-api-access-ln7tj\") pod \"auto-csr-approver-29566906-4pc5f\" (UID: \"91dea7c8-c5a7-4c12-8e7b-8477fededeb5\") " pod="openshift-infra/auto-csr-approver-29566906-4pc5f" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.349247 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-sg-core-conf-yaml\") pod \"f447de3d-38fe-406a-a816-773b3779497a\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.349329 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-config-data\") pod \"f447de3d-38fe-406a-a816-773b3779497a\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.349360 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcf68\" (UniqueName: \"kubernetes.io/projected/f447de3d-38fe-406a-a816-773b3779497a-kube-api-access-kcf68\") pod \"f447de3d-38fe-406a-a816-773b3779497a\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.349410 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-combined-ca-bundle\") pod \"f447de3d-38fe-406a-a816-773b3779497a\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.349443 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-run-httpd\") pod \"f447de3d-38fe-406a-a816-773b3779497a\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.349469 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-scripts\") pod \"f447de3d-38fe-406a-a816-773b3779497a\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.349745 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-log-httpd\") pod \"f447de3d-38fe-406a-a816-773b3779497a\" (UID: \"f447de3d-38fe-406a-a816-773b3779497a\") " Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.350371 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln7tj\" (UniqueName: \"kubernetes.io/projected/91dea7c8-c5a7-4c12-8e7b-8477fededeb5-kube-api-access-ln7tj\") pod \"auto-csr-approver-29566906-4pc5f\" (UID: \"91dea7c8-c5a7-4c12-8e7b-8477fededeb5\") " pod="openshift-infra/auto-csr-approver-29566906-4pc5f" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.350570 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f447de3d-38fe-406a-a816-773b3779497a" (UID: "f447de3d-38fe-406a-a816-773b3779497a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.351186 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f447de3d-38fe-406a-a816-773b3779497a" (UID: "f447de3d-38fe-406a-a816-773b3779497a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.356254 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-scripts" (OuterVolumeSpecName: "scripts") pod "f447de3d-38fe-406a-a816-773b3779497a" (UID: "f447de3d-38fe-406a-a816-773b3779497a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.359658 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f447de3d-38fe-406a-a816-773b3779497a-kube-api-access-kcf68" (OuterVolumeSpecName: "kube-api-access-kcf68") pod "f447de3d-38fe-406a-a816-773b3779497a" (UID: "f447de3d-38fe-406a-a816-773b3779497a"). 
InnerVolumeSpecName "kube-api-access-kcf68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.370113 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln7tj\" (UniqueName: \"kubernetes.io/projected/91dea7c8-c5a7-4c12-8e7b-8477fededeb5-kube-api-access-ln7tj\") pod \"auto-csr-approver-29566906-4pc5f\" (UID: \"91dea7c8-c5a7-4c12-8e7b-8477fededeb5\") " pod="openshift-infra/auto-csr-approver-29566906-4pc5f" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.391093 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f447de3d-38fe-406a-a816-773b3779497a" (UID: "f447de3d-38fe-406a-a816-773b3779497a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.431508 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f447de3d-38fe-406a-a816-773b3779497a" (UID: "f447de3d-38fe-406a-a816-773b3779497a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.447514 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-config-data" (OuterVolumeSpecName: "config-data") pod "f447de3d-38fe-406a-a816-773b3779497a" (UID: "f447de3d-38fe-406a-a816-773b3779497a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.452468 4849 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.452497 4849 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.452509 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.452518 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcf68\" (UniqueName: \"kubernetes.io/projected/f447de3d-38fe-406a-a816-773b3779497a-kube-api-access-kcf68\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.452526 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.452536 4849 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f447de3d-38fe-406a-a816-773b3779497a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.452544 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f447de3d-38fe-406a-a816-773b3779497a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.502025 4849 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-4pc5f" Mar 20 13:46:00 crc kubenswrapper[4849]: I0320 13:46:00.959765 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-4pc5f"] Mar 20 13:46:00 crc kubenswrapper[4849]: W0320 13:46:00.965489 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91dea7c8_c5a7_4c12_8e7b_8477fededeb5.slice/crio-cb8e0c5f9c062ea0b3f6b7f9b2faabca2882eb3261c43ef3ca62effa153ea7b9 WatchSource:0}: Error finding container cb8e0c5f9c062ea0b3f6b7f9b2faabca2882eb3261c43ef3ca62effa153ea7b9: Status 404 returned error can't find the container with id cb8e0c5f9c062ea0b3f6b7f9b2faabca2882eb3261c43ef3ca62effa153ea7b9 Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.111747 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f447de3d-38fe-406a-a816-773b3779497a","Type":"ContainerDied","Data":"3e7bcfe0811623bd37688090b8d71586400f2d2cb3c63c225bb74db684b06f29"} Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.111797 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.111802 4849 scope.go:117] "RemoveContainer" containerID="18ae6391048ec982f1e8063c5332f056357f2998d1e82fe55f9139ea3fff2d94" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.112869 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-4pc5f" event={"ID":"91dea7c8-c5a7-4c12-8e7b-8477fededeb5","Type":"ContainerStarted","Data":"cb8e0c5f9c062ea0b3f6b7f9b2faabca2882eb3261c43ef3ca62effa153ea7b9"} Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.144270 4849 scope.go:117] "RemoveContainer" containerID="54e8fa3a413c75c90d57e5ddcc8f6506329abdcd4dc1597bf38927b0149676d2" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.162566 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.170571 4849 scope.go:117] "RemoveContainer" containerID="0411e9b1881e2a70ce08239972fe2503e2f76ee6c10f2e589607cd375f7d16db" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.173087 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.185730 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:01 crc kubenswrapper[4849]: E0320 13:46:01.186190 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="ceilometer-notification-agent" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.186211 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="ceilometer-notification-agent" Mar 20 13:46:01 crc kubenswrapper[4849]: E0320 13:46:01.186244 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="ceilometer-central-agent" Mar 
20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.186253 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="ceilometer-central-agent" Mar 20 13:46:01 crc kubenswrapper[4849]: E0320 13:46:01.186274 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="sg-core" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.186283 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="sg-core" Mar 20 13:46:01 crc kubenswrapper[4849]: E0320 13:46:01.186304 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="proxy-httpd" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.186312 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="proxy-httpd" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.186544 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="ceilometer-notification-agent" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.186576 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="ceilometer-central-agent" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.186586 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="sg-core" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.186601 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f447de3d-38fe-406a-a816-773b3779497a" containerName="proxy-httpd" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.188714 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.194386 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.194976 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.195122 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.208881 4849 scope.go:117] "RemoveContainer" containerID="3190e192417242dccdf5cbb501847bc5e3a7c192d4fe9402165862b7c270340f" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.368510 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-config-data\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.368840 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-run-httpd\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.368950 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.369032 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-scripts\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.369128 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh8zn\" (UniqueName: \"kubernetes.io/projected/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-kube-api-access-fh8zn\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.369211 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.369309 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-log-httpd\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.470523 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-run-httpd\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.470898 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " 
pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.470946 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-scripts\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.471080 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-run-httpd\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.471338 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8zn\" (UniqueName: \"kubernetes.io/projected/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-kube-api-access-fh8zn\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.471383 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.471460 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-log-httpd\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.471529 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-config-data\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.472345 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-log-httpd\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.478330 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.479593 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-config-data\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.486636 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.487343 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh8zn\" (UniqueName: \"kubernetes.io/projected/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-kube-api-access-fh8zn\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.487670 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-scripts\") pod \"ceilometer-0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.560654 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:01 crc kubenswrapper[4849]: I0320 13:46:01.993249 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:02 crc kubenswrapper[4849]: I0320 13:46:02.123493 4849 generic.go:334] "Generic (PLEG): container finished" podID="c100d127-fda4-4f86-89d7-64a19be3e8ea" containerID="7ed75db7a040c41a3c2f0b158b085a98f26a5f20c673bec43eada26c5e893852" exitCode=0 Mar 20 13:46:02 crc kubenswrapper[4849]: I0320 13:46:02.123568 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" event={"ID":"c100d127-fda4-4f86-89d7-64a19be3e8ea","Type":"ContainerDied","Data":"7ed75db7a040c41a3c2f0b158b085a98f26a5f20c673bec43eada26c5e893852"} Mar 20 13:46:02 crc kubenswrapper[4849]: I0320 13:46:02.125158 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerStarted","Data":"eb02db2a92e824f07f6d82de8c16025882cdeab2a43342ee6de18c1337de2c05"} Mar 20 13:46:02 crc kubenswrapper[4849]: I0320 13:46:02.917673 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:46:02 crc kubenswrapper[4849]: I0320 13:46:02.918032 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:46:02 crc kubenswrapper[4849]: I0320 13:46:02.946135 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 20 13:46:02 crc kubenswrapper[4849]: I0320 13:46:02.962383 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.048442 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f447de3d-38fe-406a-a816-773b3779497a" path="/var/lib/kubelet/pods/f447de3d-38fe-406a-a816-773b3779497a/volumes" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.133862 4849 generic.go:334] "Generic (PLEG): container finished" podID="91dea7c8-c5a7-4c12-8e7b-8477fededeb5" containerID="2049386afc5409d1f4d192768a19d21afc8294cd5dcc96d60464a0b509e0003a" exitCode=0 Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.133922 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-4pc5f" event={"ID":"91dea7c8-c5a7-4c12-8e7b-8477fededeb5","Type":"ContainerDied","Data":"2049386afc5409d1f4d192768a19d21afc8294cd5dcc96d60464a0b509e0003a"} Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.144575 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerStarted","Data":"a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e"} Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.144732 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.144784 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.479436 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.618685 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-combined-ca-bundle\") pod \"c100d127-fda4-4f86-89d7-64a19be3e8ea\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.618776 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4mg4\" (UniqueName: \"kubernetes.io/projected/c100d127-fda4-4f86-89d7-64a19be3e8ea-kube-api-access-s4mg4\") pod \"c100d127-fda4-4f86-89d7-64a19be3e8ea\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.618870 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-config-data\") pod \"c100d127-fda4-4f86-89d7-64a19be3e8ea\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.618947 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-scripts\") pod \"c100d127-fda4-4f86-89d7-64a19be3e8ea\" (UID: \"c100d127-fda4-4f86-89d7-64a19be3e8ea\") " Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.623639 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c100d127-fda4-4f86-89d7-64a19be3e8ea-kube-api-access-s4mg4" (OuterVolumeSpecName: "kube-api-access-s4mg4") pod "c100d127-fda4-4f86-89d7-64a19be3e8ea" (UID: "c100d127-fda4-4f86-89d7-64a19be3e8ea"). InnerVolumeSpecName "kube-api-access-s4mg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.624404 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-scripts" (OuterVolumeSpecName: "scripts") pod "c100d127-fda4-4f86-89d7-64a19be3e8ea" (UID: "c100d127-fda4-4f86-89d7-64a19be3e8ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.646625 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c100d127-fda4-4f86-89d7-64a19be3e8ea" (UID: "c100d127-fda4-4f86-89d7-64a19be3e8ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.670813 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-config-data" (OuterVolumeSpecName: "config-data") pod "c100d127-fda4-4f86-89d7-64a19be3e8ea" (UID: "c100d127-fda4-4f86-89d7-64a19be3e8ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.721453 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.721490 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.721499 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c100d127-fda4-4f86-89d7-64a19be3e8ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 crc kubenswrapper[4849]: I0320 13:46:03.721509 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4mg4\" (UniqueName: \"kubernetes.io/projected/c100d127-fda4-4f86-89d7-64a19be3e8ea-kube-api-access-s4mg4\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.162390 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" event={"ID":"c100d127-fda4-4f86-89d7-64a19be3e8ea","Type":"ContainerDied","Data":"f811d008376c3b685ae0b4f0c20929eb5fe9d88ac1c8ddec1417fe13e6663d0b"} Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.162687 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f811d008376c3b685ae0b4f0c20929eb5fe9d88ac1c8ddec1417fe13e6663d0b" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.162731 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bsbt2" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.170915 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerStarted","Data":"b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9"} Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.236666 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:46:04 crc kubenswrapper[4849]: E0320 13:46:04.237252 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c100d127-fda4-4f86-89d7-64a19be3e8ea" containerName="nova-cell0-conductor-db-sync" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.237273 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="c100d127-fda4-4f86-89d7-64a19be3e8ea" containerName="nova-cell0-conductor-db-sync" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.237464 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="c100d127-fda4-4f86-89d7-64a19be3e8ea" containerName="nova-cell0-conductor-db-sync" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.238104 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.272591 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.272800 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-25pzz" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.292194 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.336133 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmf6\" (UniqueName: \"kubernetes.io/projected/6d9f9502-8fe8-4a51-9891-29506dce2581-kube-api-access-mdmf6\") pod \"nova-cell0-conductor-0\" (UID: \"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.336170 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9f9502-8fe8-4a51-9891-29506dce2581-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.336209 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9f9502-8fe8-4a51-9891-29506dce2581-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.437461 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmf6\" (UniqueName: 
\"kubernetes.io/projected/6d9f9502-8fe8-4a51-9891-29506dce2581-kube-api-access-mdmf6\") pod \"nova-cell0-conductor-0\" (UID: \"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.437506 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9f9502-8fe8-4a51-9891-29506dce2581-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.437542 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9f9502-8fe8-4a51-9891-29506dce2581-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.451931 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9f9502-8fe8-4a51-9891-29506dce2581-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.452493 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9f9502-8fe8-4a51-9891-29506dce2581-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.458557 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmf6\" (UniqueName: \"kubernetes.io/projected/6d9f9502-8fe8-4a51-9891-29506dce2581-kube-api-access-mdmf6\") pod \"nova-cell0-conductor-0\" (UID: 
\"6d9f9502-8fe8-4a51-9891-29506dce2581\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.544226 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-4pc5f" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.591314 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.627245 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.627718 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.640454 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln7tj\" (UniqueName: \"kubernetes.io/projected/91dea7c8-c5a7-4c12-8e7b-8477fededeb5-kube-api-access-ln7tj\") pod \"91dea7c8-c5a7-4c12-8e7b-8477fededeb5\" (UID: \"91dea7c8-c5a7-4c12-8e7b-8477fededeb5\") " Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.651581 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91dea7c8-c5a7-4c12-8e7b-8477fededeb5-kube-api-access-ln7tj" (OuterVolumeSpecName: "kube-api-access-ln7tj") pod "91dea7c8-c5a7-4c12-8e7b-8477fededeb5" (UID: "91dea7c8-c5a7-4c12-8e7b-8477fededeb5"). InnerVolumeSpecName "kube-api-access-ln7tj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.674018 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.687695 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:04 crc kubenswrapper[4849]: I0320 13:46:04.743936 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln7tj\" (UniqueName: \"kubernetes.io/projected/91dea7c8-c5a7-4c12-8e7b-8477fededeb5-kube-api-access-ln7tj\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.055925 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.183266 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6d9f9502-8fe8-4a51-9891-29506dce2581","Type":"ContainerStarted","Data":"e8ac5115f563b9121025b1a4ddad13e9a58a4c4accc1d93f95a02849683d97fa"} Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.185323 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-4pc5f" event={"ID":"91dea7c8-c5a7-4c12-8e7b-8477fededeb5","Type":"ContainerDied","Data":"cb8e0c5f9c062ea0b3f6b7f9b2faabca2882eb3261c43ef3ca62effa153ea7b9"} Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.185361 4849 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.185378 4849 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.185372 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb8e0c5f9c062ea0b3f6b7f9b2faabca2882eb3261c43ef3ca62effa153ea7b9" Mar 
20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.185382 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-4pc5f" Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.186087 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.186114 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.287305 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.288432 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.637312 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-bwrb8"] Mar 20 13:46:05 crc kubenswrapper[4849]: I0320 13:46:05.642234 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-bwrb8"] Mar 20 13:46:06 crc kubenswrapper[4849]: I0320 13:46:06.195660 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"6d9f9502-8fe8-4a51-9891-29506dce2581","Type":"ContainerStarted","Data":"692d34f7ccd21737e4a6c015f5d7b4f77519134776111641f4d5c47b1e966e81"} Mar 20 13:46:06 crc kubenswrapper[4849]: I0320 13:46:06.196860 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:06 crc kubenswrapper[4849]: I0320 13:46:06.199746 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerStarted","Data":"abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c"} Mar 20 13:46:06 crc kubenswrapper[4849]: I0320 13:46:06.219395 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.21937802 podStartE2EDuration="2.21937802s" podCreationTimestamp="2026-03-20 13:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:06.216466024 +0000 UTC m=+1315.894189419" watchObservedRunningTime="2026-03-20 13:46:06.21937802 +0000 UTC m=+1315.897101415" Mar 20 13:46:07 crc kubenswrapper[4849]: I0320 13:46:07.047513 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0" path="/var/lib/kubelet/pods/14cb3d0f-a0ee-4565-a3a5-b2ffa27586d0/volumes" Mar 20 13:46:07 crc kubenswrapper[4849]: I0320 13:46:07.088014 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:07 crc kubenswrapper[4849]: I0320 13:46:07.147379 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:08 crc kubenswrapper[4849]: I0320 13:46:08.231506 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerStarted","Data":"12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1"} Mar 20 13:46:08 crc kubenswrapper[4849]: I0320 13:46:08.231946 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:46:08 crc kubenswrapper[4849]: I0320 13:46:08.256092 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.644309045 
podStartE2EDuration="7.256070135s" podCreationTimestamp="2026-03-20 13:46:01 +0000 UTC" firstStartedPulling="2026-03-20 13:46:01.995811729 +0000 UTC m=+1311.673535124" lastFinishedPulling="2026-03-20 13:46:07.607572819 +0000 UTC m=+1317.285296214" observedRunningTime="2026-03-20 13:46:08.252358488 +0000 UTC m=+1317.930081903" watchObservedRunningTime="2026-03-20 13:46:08.256070135 +0000 UTC m=+1317.933793530" Mar 20 13:46:14 crc kubenswrapper[4849]: I0320 13:46:14.615467 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.132528 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bxjmr"] Mar 20 13:46:15 crc kubenswrapper[4849]: E0320 13:46:15.132880 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91dea7c8-c5a7-4c12-8e7b-8477fededeb5" containerName="oc" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.132893 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="91dea7c8-c5a7-4c12-8e7b-8477fededeb5" containerName="oc" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.133058 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="91dea7c8-c5a7-4c12-8e7b-8477fededeb5" containerName="oc" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.133690 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.135874 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.141168 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.159637 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bxjmr"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.334239 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.334278 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4h7z\" (UniqueName: \"kubernetes.io/projected/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-kube-api-access-w4h7z\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.334392 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-scripts\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.334480 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: 
I0320 13:46:15.334570 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-config-data\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.338223 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.340047 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.356799 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.389510 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.399069 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.402557 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.407009 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.436678 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.436733 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-config-data\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.436779 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfmb\" (UniqueName: \"kubernetes.io/projected/fcb1dd05-5332-4f88-8d33-b456d59d00d0-kube-api-access-whfmb\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.436896 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-config-data\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.436944 4849 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2474fc3-733d-4664-9282-bda5d96a817d-logs\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.436985 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.437022 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.437046 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-config-data\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.437084 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4h7z\" (UniqueName: \"kubernetes.io/projected/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-kube-api-access-w4h7z\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.437120 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-57n75\" (UniqueName: \"kubernetes.io/projected/e2474fc3-733d-4664-9282-bda5d96a817d-kube-api-access-57n75\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.437150 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-scripts\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.445708 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.458522 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-config-data\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.459041 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.460053 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.464444 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-scripts\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.464793 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.474153 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.482335 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4h7z\" (UniqueName: \"kubernetes.io/projected/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-kube-api-access-w4h7z\") pod \"nova-cell0-cell-mapping-bxjmr\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") " pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538247 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-config-data\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538296 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-config-data\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538321 4849 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2474fc3-733d-4664-9282-bda5d96a817d-logs\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538343 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538368 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538400 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538417 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-config-data\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538441 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgt94\" (UniqueName: \"kubernetes.io/projected/da288402-5077-4a23-b2f9-893b1ce9d894-kube-api-access-cgt94\") pod \"nova-metadata-0\" (UID: 
\"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538466 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57n75\" (UniqueName: \"kubernetes.io/projected/e2474fc3-733d-4664-9282-bda5d96a817d-kube-api-access-57n75\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538509 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da288402-5077-4a23-b2f9-893b1ce9d894-logs\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.538538 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whfmb\" (UniqueName: \"kubernetes.io/projected/fcb1dd05-5332-4f88-8d33-b456d59d00d0-kube-api-access-whfmb\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.540211 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2474fc3-733d-4664-9282-bda5d96a817d-logs\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.547348 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-config-data\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.551551 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.560623 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-config-data\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.574613 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.587034 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whfmb\" (UniqueName: \"kubernetes.io/projected/fcb1dd05-5332-4f88-8d33-b456d59d00d0-kube-api-access-whfmb\") pod \"nova-scheduler-0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.598830 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.600270 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.607122 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.624499 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57n75\" (UniqueName: \"kubernetes.io/projected/e2474fc3-733d-4664-9282-bda5d96a817d-kube-api-access-57n75\") pod \"nova-api-0\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.626867 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-t4j65"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.628348 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.642582 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-config-data\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.642629 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.642682 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgt94\" (UniqueName: \"kubernetes.io/projected/da288402-5077-4a23-b2f9-893b1ce9d894-kube-api-access-cgt94\") pod \"nova-metadata-0\" (UID: 
\"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.642737 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da288402-5077-4a23-b2f9-893b1ce9d894-logs\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.643259 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da288402-5077-4a23-b2f9-893b1ce9d894-logs\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.644069 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.651043 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.652460 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.653747 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-config-data\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.677350 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-t4j65"] Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.684784 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgt94\" (UniqueName: \"kubernetes.io/projected/da288402-5077-4a23-b2f9-893b1ce9d894-kube-api-access-cgt94\") pod \"nova-metadata-0\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.719173 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.720678 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.745519 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.745563 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-config\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.745597 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.745621 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntfj4\" (UniqueName: \"kubernetes.io/projected/cfb1017b-3277-4667-aade-d2852d0ddd0e-kube-api-access-ntfj4\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.746172 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.746273 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9rg\" (UniqueName: \"kubernetes.io/projected/8e76c274-ce20-458a-a78e-84f736089dd1-kube-api-access-sr9rg\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.746302 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-svc\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.746945 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.749086 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.751801 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bxjmr" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851310 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9rg\" (UniqueName: \"kubernetes.io/projected/8e76c274-ce20-458a-a78e-84f736089dd1-kube-api-access-sr9rg\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851363 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-svc\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851443 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851465 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851486 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " 
pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851508 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-config\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851530 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851551 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntfj4\" (UniqueName: \"kubernetes.io/projected/cfb1017b-3277-4667-aade-d2852d0ddd0e-kube-api-access-ntfj4\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.851579 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.852885 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc 
kubenswrapper[4849]: I0320 13:46:15.853176 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-config\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.854199 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.854238 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-svc\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.855538 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.874228 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.874246 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.874518 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9rg\" (UniqueName: \"kubernetes.io/projected/8e76c274-ce20-458a-a78e-84f736089dd1-kube-api-access-sr9rg\") pod \"dnsmasq-dns-757b4f8459-t4j65\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:15 crc kubenswrapper[4849]: I0320 13:46:15.877527 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntfj4\" (UniqueName: \"kubernetes.io/projected/cfb1017b-3277-4667-aade-d2852d0ddd0e-kube-api-access-ntfj4\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.041244 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.057629 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.196132 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccqkc"] Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.197483 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.199682 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.199921 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.241501 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.273480 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-config-data\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.274458 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkrl5\" (UniqueName: \"kubernetes.io/projected/cfff4046-20be-4224-8bc7-0741b2fd01a7-kube-api-access-nkrl5\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.273758 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccqkc"] Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.274907 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-scripts\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 
13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.275020 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.308976 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2474fc3-733d-4664-9282-bda5d96a817d","Type":"ContainerStarted","Data":"b9e75e3b543052eb83935351009479896568b039712f4c206de88c9730f98d48"} Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.336394 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:46:16 crc kubenswrapper[4849]: W0320 13:46:16.345711 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcb1dd05_5332_4f88_8d33_b456d59d00d0.slice/crio-05b2b4a97c34ef32ad4162d184b7a6e89f5c4ba523dfe0c60fb83fb899b092e6 WatchSource:0}: Error finding container 05b2b4a97c34ef32ad4162d184b7a6e89f5c4ba523dfe0c60fb83fb899b092e6: Status 404 returned error can't find the container with id 05b2b4a97c34ef32ad4162d184b7a6e89f5c4ba523dfe0c60fb83fb899b092e6 Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.347239 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:16 crc kubenswrapper[4849]: W0320 13:46:16.348016 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda288402_5077_4a23_b2f9_893b1ce9d894.slice/crio-4cc0429cfa53bac079840e9eaf8eafc93264d53c2e65c94365e7fb6f13da4871 WatchSource:0}: Error finding container 4cc0429cfa53bac079840e9eaf8eafc93264d53c2e65c94365e7fb6f13da4871: Status 404 returned 
error can't find the container with id 4cc0429cfa53bac079840e9eaf8eafc93264d53c2e65c94365e7fb6f13da4871 Mar 20 13:46:16 crc kubenswrapper[4849]: W0320 13:46:16.361192 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce9903d0_8cc8_4bce_99da_96d1e8657e2a.slice/crio-5403d455ad5d05aa0ad56fa09ddbfb4dd90c509c95e79046919beb1855bf1701 WatchSource:0}: Error finding container 5403d455ad5d05aa0ad56fa09ddbfb4dd90c509c95e79046919beb1855bf1701: Status 404 returned error can't find the container with id 5403d455ad5d05aa0ad56fa09ddbfb4dd90c509c95e79046919beb1855bf1701 Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.368131 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bxjmr"] Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.383117 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkrl5\" (UniqueName: \"kubernetes.io/projected/cfff4046-20be-4224-8bc7-0741b2fd01a7-kube-api-access-nkrl5\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.383355 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-scripts\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.383448 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " 
pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.383496 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-config-data\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.389005 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-scripts\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.390291 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.396665 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-config-data\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.398695 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkrl5\" (UniqueName: \"kubernetes.io/projected/cfff4046-20be-4224-8bc7-0741b2fd01a7-kube-api-access-nkrl5\") pod \"nova-cell1-conductor-db-sync-ccqkc\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") " 
pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.522095 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccqkc" Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.594866 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:16 crc kubenswrapper[4849]: W0320 13:46:16.616602 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfb1017b_3277_4667_aade_d2852d0ddd0e.slice/crio-dc928e04393f71357e364735b2de3e3d45a91464260d00da9e7ed16df920a836 WatchSource:0}: Error finding container dc928e04393f71357e364735b2de3e3d45a91464260d00da9e7ed16df920a836: Status 404 returned error can't find the container with id dc928e04393f71357e364735b2de3e3d45a91464260d00da9e7ed16df920a836 Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.676580 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-t4j65"] Mar 20 13:46:16 crc kubenswrapper[4849]: I0320 13:46:16.994313 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccqkc"] Mar 20 13:46:17 crc kubenswrapper[4849]: W0320 13:46:17.014301 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfff4046_20be_4224_8bc7_0741b2fd01a7.slice/crio-3ec47cb3dfb5232960a138ed3f4890eb0b9157b144cb77abb42244cfa3cb2282 WatchSource:0}: Error finding container 3ec47cb3dfb5232960a138ed3f4890eb0b9157b144cb77abb42244cfa3cb2282: Status 404 returned error can't find the container with id 3ec47cb3dfb5232960a138ed3f4890eb0b9157b144cb77abb42244cfa3cb2282 Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.320289 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccqkc" 
event={"ID":"cfff4046-20be-4224-8bc7-0741b2fd01a7","Type":"ContainerStarted","Data":"662bd500bcc5017ab0e41959fcd1d810385d49d194b351fc9a42280103fe4006"} Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.320629 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccqkc" event={"ID":"cfff4046-20be-4224-8bc7-0741b2fd01a7","Type":"ContainerStarted","Data":"3ec47cb3dfb5232960a138ed3f4890eb0b9157b144cb77abb42244cfa3cb2282"} Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.322390 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da288402-5077-4a23-b2f9-893b1ce9d894","Type":"ContainerStarted","Data":"4cc0429cfa53bac079840e9eaf8eafc93264d53c2e65c94365e7fb6f13da4871"} Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.325898 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfb1017b-3277-4667-aade-d2852d0ddd0e","Type":"ContainerStarted","Data":"dc928e04393f71357e364735b2de3e3d45a91464260d00da9e7ed16df920a836"} Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.328327 4849 generic.go:334] "Generic (PLEG): container finished" podID="8e76c274-ce20-458a-a78e-84f736089dd1" containerID="ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1" exitCode=0 Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.328612 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" event={"ID":"8e76c274-ce20-458a-a78e-84f736089dd1","Type":"ContainerDied","Data":"ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1"} Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.328641 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" event={"ID":"8e76c274-ce20-458a-a78e-84f736089dd1","Type":"ContainerStarted","Data":"81b8edb16c14aafab2290be1123ecf4284bd146f5d05164db422fa0bf53f3707"} Mar 20 13:46:17 crc 
kubenswrapper[4849]: I0320 13:46:17.332252 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcb1dd05-5332-4f88-8d33-b456d59d00d0","Type":"ContainerStarted","Data":"05b2b4a97c34ef32ad4162d184b7a6e89f5c4ba523dfe0c60fb83fb899b092e6"} Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.342023 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bxjmr" event={"ID":"ce9903d0-8cc8-4bce-99da-96d1e8657e2a","Type":"ContainerStarted","Data":"b2e1de513c310954fd87fb1ed6adf147981f9125667ee41d366af6f0e226eacd"} Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.342072 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bxjmr" event={"ID":"ce9903d0-8cc8-4bce-99da-96d1e8657e2a","Type":"ContainerStarted","Data":"5403d455ad5d05aa0ad56fa09ddbfb4dd90c509c95e79046919beb1855bf1701"} Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.359494 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ccqkc" podStartSLOduration=1.359436584 podStartE2EDuration="1.359436584s" podCreationTimestamp="2026-03-20 13:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:17.333525049 +0000 UTC m=+1327.011248464" watchObservedRunningTime="2026-03-20 13:46:17.359436584 +0000 UTC m=+1327.037159999" Mar 20 13:46:17 crc kubenswrapper[4849]: I0320 13:46:17.387368 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bxjmr" podStartSLOduration=2.387352111 podStartE2EDuration="2.387352111s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:17.379979779 +0000 UTC m=+1327.057703174" 
watchObservedRunningTime="2026-03-20 13:46:17.387352111 +0000 UTC m=+1327.065075506" Mar 20 13:46:19 crc kubenswrapper[4849]: I0320 13:46:19.131949 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:19 crc kubenswrapper[4849]: I0320 13:46:19.145507 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.384901 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfb1017b-3277-4667-aade-d2852d0ddd0e","Type":"ContainerStarted","Data":"9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8"} Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.384960 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cfb1017b-3277-4667-aade-d2852d0ddd0e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8" gracePeriod=30 Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.387719 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" event={"ID":"8e76c274-ce20-458a-a78e-84f736089dd1","Type":"ContainerStarted","Data":"fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5"} Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.388476 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.392723 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcb1dd05-5332-4f88-8d33-b456d59d00d0","Type":"ContainerStarted","Data":"cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31"} Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.395624 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"e2474fc3-733d-4664-9282-bda5d96a817d","Type":"ContainerStarted","Data":"915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a"} Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.395662 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2474fc3-733d-4664-9282-bda5d96a817d","Type":"ContainerStarted","Data":"3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a"} Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.405302 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da288402-5077-4a23-b2f9-893b1ce9d894","Type":"ContainerStarted","Data":"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21"} Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.405381 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da288402-5077-4a23-b2f9-893b1ce9d894","Type":"ContainerStarted","Data":"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24"} Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.405427 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" containerName="nova-metadata-log" containerID="cri-o://983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24" gracePeriod=30 Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.405468 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" containerName="nova-metadata-metadata" containerID="cri-o://db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21" gracePeriod=30 Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.406288 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=2.6981684980000002 podStartE2EDuration="5.406275226s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="2026-03-20 13:46:16.626398865 +0000 UTC m=+1326.304122270" lastFinishedPulling="2026-03-20 13:46:19.334505603 +0000 UTC m=+1329.012228998" observedRunningTime="2026-03-20 13:46:20.398313849 +0000 UTC m=+1330.076037244" watchObservedRunningTime="2026-03-20 13:46:20.406275226 +0000 UTC m=+1330.083998621" Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.432744 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.339501814 podStartE2EDuration="5.432717505s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="2026-03-20 13:46:16.243036357 +0000 UTC m=+1325.920759752" lastFinishedPulling="2026-03-20 13:46:19.336252048 +0000 UTC m=+1329.013975443" observedRunningTime="2026-03-20 13:46:20.424108121 +0000 UTC m=+1330.101831536" watchObservedRunningTime="2026-03-20 13:46:20.432717505 +0000 UTC m=+1330.110440930" Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.448618 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.466259306 podStartE2EDuration="5.448594909s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="2026-03-20 13:46:16.348742021 +0000 UTC m=+1326.026465416" lastFinishedPulling="2026-03-20 13:46:19.331077624 +0000 UTC m=+1329.008801019" observedRunningTime="2026-03-20 13:46:20.441634997 +0000 UTC m=+1330.119358412" watchObservedRunningTime="2026-03-20 13:46:20.448594909 +0000 UTC m=+1330.126318304" Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.467341 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" podStartSLOduration=5.467323697 podStartE2EDuration="5.467323697s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:20.46207374 +0000 UTC m=+1330.139797155" watchObservedRunningTime="2026-03-20 13:46:20.467323697 +0000 UTC m=+1330.145047092" Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.483708 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.50863283 podStartE2EDuration="5.483686343s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="2026-03-20 13:46:16.354692996 +0000 UTC m=+1326.032416391" lastFinishedPulling="2026-03-20 13:46:19.329746509 +0000 UTC m=+1329.007469904" observedRunningTime="2026-03-20 13:46:20.477743668 +0000 UTC m=+1330.155467063" watchObservedRunningTime="2026-03-20 13:46:20.483686343 +0000 UTC m=+1330.161409738" Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.721989 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:46:20 crc kubenswrapper[4849]: I0320 13:46:20.993627 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.054329 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.099333 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgt94\" (UniqueName: \"kubernetes.io/projected/da288402-5077-4a23-b2f9-893b1ce9d894-kube-api-access-cgt94\") pod \"da288402-5077-4a23-b2f9-893b1ce9d894\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.099387 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-config-data\") pod \"da288402-5077-4a23-b2f9-893b1ce9d894\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.099626 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-combined-ca-bundle\") pod \"da288402-5077-4a23-b2f9-893b1ce9d894\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.099646 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da288402-5077-4a23-b2f9-893b1ce9d894-logs\") pod \"da288402-5077-4a23-b2f9-893b1ce9d894\" (UID: \"da288402-5077-4a23-b2f9-893b1ce9d894\") " Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.100746 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da288402-5077-4a23-b2f9-893b1ce9d894-logs" (OuterVolumeSpecName: "logs") pod "da288402-5077-4a23-b2f9-893b1ce9d894" (UID: "da288402-5077-4a23-b2f9-893b1ce9d894"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.105124 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da288402-5077-4a23-b2f9-893b1ce9d894-kube-api-access-cgt94" (OuterVolumeSpecName: "kube-api-access-cgt94") pod "da288402-5077-4a23-b2f9-893b1ce9d894" (UID: "da288402-5077-4a23-b2f9-893b1ce9d894"). InnerVolumeSpecName "kube-api-access-cgt94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.126149 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-config-data" (OuterVolumeSpecName: "config-data") pod "da288402-5077-4a23-b2f9-893b1ce9d894" (UID: "da288402-5077-4a23-b2f9-893b1ce9d894"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.133380 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da288402-5077-4a23-b2f9-893b1ce9d894" (UID: "da288402-5077-4a23-b2f9-893b1ce9d894"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.202452 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.202484 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da288402-5077-4a23-b2f9-893b1ce9d894-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.202495 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgt94\" (UniqueName: \"kubernetes.io/projected/da288402-5077-4a23-b2f9-893b1ce9d894-kube-api-access-cgt94\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.202505 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da288402-5077-4a23-b2f9-893b1ce9d894-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.415521 4849 generic.go:334] "Generic (PLEG): container finished" podID="da288402-5077-4a23-b2f9-893b1ce9d894" containerID="db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21" exitCode=0
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.415544 4849 generic.go:334] "Generic (PLEG): container finished" podID="da288402-5077-4a23-b2f9-893b1ce9d894" containerID="983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24" exitCode=143
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.416169 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.418884 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da288402-5077-4a23-b2f9-893b1ce9d894","Type":"ContainerDied","Data":"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21"}
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.418961 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da288402-5077-4a23-b2f9-893b1ce9d894","Type":"ContainerDied","Data":"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24"}
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.418974 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da288402-5077-4a23-b2f9-893b1ce9d894","Type":"ContainerDied","Data":"4cc0429cfa53bac079840e9eaf8eafc93264d53c2e65c94365e7fb6f13da4871"}
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.418990 4849 scope.go:117] "RemoveContainer" containerID="db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.455918 4849 scope.go:117] "RemoveContainer" containerID="983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.486172 4849 scope.go:117] "RemoveContainer" containerID="db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21"
Mar 20 13:46:21 crc kubenswrapper[4849]: E0320 13:46:21.488257 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21\": container with ID starting with db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21 not found: ID does not exist" containerID="db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.488321 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21"} err="failed to get container status \"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21\": rpc error: code = NotFound desc = could not find container \"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21\": container with ID starting with db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21 not found: ID does not exist"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.488351 4849 scope.go:117] "RemoveContainer" containerID="983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24"
Mar 20 13:46:21 crc kubenswrapper[4849]: E0320 13:46:21.488880 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24\": container with ID starting with 983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24 not found: ID does not exist" containerID="983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.488922 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24"} err="failed to get container status \"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24\": rpc error: code = NotFound desc = could not find container \"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24\": container with ID starting with 983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24 not found: ID does not exist"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.488943 4849 scope.go:117] "RemoveContainer" containerID="db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.491863 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21"} err="failed to get container status \"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21\": rpc error: code = NotFound desc = could not find container \"db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21\": container with ID starting with db76416f61c0a2b39c5e550a16eb09c4ffdc98488320a061ee8c5cdeb05d3a21 not found: ID does not exist"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.491919 4849 scope.go:117] "RemoveContainer" containerID="983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.492488 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24"} err="failed to get container status \"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24\": rpc error: code = NotFound desc = could not find container \"983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24\": container with ID starting with 983d062dccf13d0dcb791a6f7ef68509dfd3109c22f40ec1672406025a273f24 not found: ID does not exist"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.512931 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.534268 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.543474 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:46:21 crc kubenswrapper[4849]: E0320 13:46:21.544088 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" containerName="nova-metadata-metadata"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.544104 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" containerName="nova-metadata-metadata"
Mar 20 13:46:21 crc kubenswrapper[4849]: E0320 13:46:21.544143 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" containerName="nova-metadata-log"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.544149 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" containerName="nova-metadata-log"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.544377 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" containerName="nova-metadata-log"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.544396 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" containerName="nova-metadata-metadata"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.545577 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.549717 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.549796 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.556666 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.713298 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-logs\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.713629 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmv6x\" (UniqueName: \"kubernetes.io/projected/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-kube-api-access-zmv6x\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.713677 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.713702 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.713872 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-config-data\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.816001 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-logs\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.816086 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmv6x\" (UniqueName: \"kubernetes.io/projected/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-kube-api-access-zmv6x\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.816133 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.816160 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.816214 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-config-data\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.816630 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-logs\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.820206 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.820704 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.821768 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-config-data\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.831974 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmv6x\" (UniqueName: \"kubernetes.io/projected/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-kube-api-access-zmv6x\") pod \"nova-metadata-0\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " pod="openstack/nova-metadata-0"
Mar 20 13:46:21 crc kubenswrapper[4849]: I0320 13:46:21.869631 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:46:22 crc kubenswrapper[4849]: I0320 13:46:22.349557 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:46:22 crc kubenswrapper[4849]: I0320 13:46:22.426189 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a27a2635-ce6e-4c86-bd1b-310fe76fba4f","Type":"ContainerStarted","Data":"e6af235109decc5eb71e24473f2ec1a145275682c2d9860069ec491d455d5338"}
Mar 20 13:46:23 crc kubenswrapper[4849]: I0320 13:46:23.089359 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da288402-5077-4a23-b2f9-893b1ce9d894" path="/var/lib/kubelet/pods/da288402-5077-4a23-b2f9-893b1ce9d894/volumes"
Mar 20 13:46:23 crc kubenswrapper[4849]: I0320 13:46:23.440840 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a27a2635-ce6e-4c86-bd1b-310fe76fba4f","Type":"ContainerStarted","Data":"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583"}
Mar 20 13:46:23 crc kubenswrapper[4849]: I0320 13:46:23.440892 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a27a2635-ce6e-4c86-bd1b-310fe76fba4f","Type":"ContainerStarted","Data":"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e"}
Mar 20 13:46:24 crc kubenswrapper[4849]: I0320 13:46:24.451771 4849 generic.go:334] "Generic (PLEG): container finished" podID="ce9903d0-8cc8-4bce-99da-96d1e8657e2a" containerID="b2e1de513c310954fd87fb1ed6adf147981f9125667ee41d366af6f0e226eacd" exitCode=0
Mar 20 13:46:24 crc kubenswrapper[4849]: I0320 13:46:24.451953 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bxjmr" event={"ID":"ce9903d0-8cc8-4bce-99da-96d1e8657e2a","Type":"ContainerDied","Data":"b2e1de513c310954fd87fb1ed6adf147981f9125667ee41d366af6f0e226eacd"}
Mar 20 13:46:24 crc kubenswrapper[4849]: I0320 13:46:24.454767 4849 generic.go:334] "Generic (PLEG): container finished" podID="cfff4046-20be-4224-8bc7-0741b2fd01a7" containerID="662bd500bcc5017ab0e41959fcd1d810385d49d194b351fc9a42280103fe4006" exitCode=0
Mar 20 13:46:24 crc kubenswrapper[4849]: I0320 13:46:24.454793 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccqkc" event={"ID":"cfff4046-20be-4224-8bc7-0741b2fd01a7","Type":"ContainerDied","Data":"662bd500bcc5017ab0e41959fcd1d810385d49d194b351fc9a42280103fe4006"}
Mar 20 13:46:24 crc kubenswrapper[4849]: I0320 13:46:24.476917 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.476900622 podStartE2EDuration="3.476900622s" podCreationTimestamp="2026-03-20 13:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:23.473179651 +0000 UTC m=+1333.150903056" watchObservedRunningTime="2026-03-20 13:46:24.476900622 +0000 UTC m=+1334.154624007"
Mar 20 13:46:25 crc kubenswrapper[4849]: I0320 13:46:25.653976 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 13:46:25 crc kubenswrapper[4849]: I0320 13:46:25.654026 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 13:46:25 crc kubenswrapper[4849]: I0320 13:46:25.721240 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 13:46:25 crc kubenswrapper[4849]: I0320 13:46:25.751373 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 13:46:25 crc kubenswrapper[4849]: I0320 13:46:25.918952 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccqkc"
Mar 20 13:46:25 crc kubenswrapper[4849]: I0320 13:46:25.935971 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bxjmr"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.015373 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-config-data\") pod \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.015414 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4h7z\" (UniqueName: \"kubernetes.io/projected/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-kube-api-access-w4h7z\") pod \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.015509 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-combined-ca-bundle\") pod \"cfff4046-20be-4224-8bc7-0741b2fd01a7\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.015530 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-scripts\") pod \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.015546 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-combined-ca-bundle\") pod \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\" (UID: \"ce9903d0-8cc8-4bce-99da-96d1e8657e2a\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.015578 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-scripts\") pod \"cfff4046-20be-4224-8bc7-0741b2fd01a7\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.015637 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkrl5\" (UniqueName: \"kubernetes.io/projected/cfff4046-20be-4224-8bc7-0741b2fd01a7-kube-api-access-nkrl5\") pod \"cfff4046-20be-4224-8bc7-0741b2fd01a7\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.015662 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-config-data\") pod \"cfff4046-20be-4224-8bc7-0741b2fd01a7\" (UID: \"cfff4046-20be-4224-8bc7-0741b2fd01a7\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.022245 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-scripts" (OuterVolumeSpecName: "scripts") pod "ce9903d0-8cc8-4bce-99da-96d1e8657e2a" (UID: "ce9903d0-8cc8-4bce-99da-96d1e8657e2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.023009 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-kube-api-access-w4h7z" (OuterVolumeSpecName: "kube-api-access-w4h7z") pod "ce9903d0-8cc8-4bce-99da-96d1e8657e2a" (UID: "ce9903d0-8cc8-4bce-99da-96d1e8657e2a"). InnerVolumeSpecName "kube-api-access-w4h7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.024982 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-scripts" (OuterVolumeSpecName: "scripts") pod "cfff4046-20be-4224-8bc7-0741b2fd01a7" (UID: "cfff4046-20be-4224-8bc7-0741b2fd01a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.037121 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfff4046-20be-4224-8bc7-0741b2fd01a7-kube-api-access-nkrl5" (OuterVolumeSpecName: "kube-api-access-nkrl5") pod "cfff4046-20be-4224-8bc7-0741b2fd01a7" (UID: "cfff4046-20be-4224-8bc7-0741b2fd01a7"). InnerVolumeSpecName "kube-api-access-nkrl5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.046375 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-config-data" (OuterVolumeSpecName: "config-data") pod "cfff4046-20be-4224-8bc7-0741b2fd01a7" (UID: "cfff4046-20be-4224-8bc7-0741b2fd01a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.047519 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfff4046-20be-4224-8bc7-0741b2fd01a7" (UID: "cfff4046-20be-4224-8bc7-0741b2fd01a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.053987 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce9903d0-8cc8-4bce-99da-96d1e8657e2a" (UID: "ce9903d0-8cc8-4bce-99da-96d1e8657e2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.054465 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-config-data" (OuterVolumeSpecName: "config-data") pod "ce9903d0-8cc8-4bce-99da-96d1e8657e2a" (UID: "ce9903d0-8cc8-4bce-99da-96d1e8657e2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.060708 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-t4j65"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.120109 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.120457 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.120490 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.120507 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.120518 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkrl5\" (UniqueName: \"kubernetes.io/projected/cfff4046-20be-4224-8bc7-0741b2fd01a7-kube-api-access-nkrl5\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.120530 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfff4046-20be-4224-8bc7-0741b2fd01a7-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.120541 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.120554 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4h7z\" (UniqueName: \"kubernetes.io/projected/ce9903d0-8cc8-4bce-99da-96d1e8657e2a-kube-api-access-w4h7z\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.150033 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-l8pb4"]
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.150490 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" podUID="e33ed079-a0fe-4167-98b1-25339aaf90d2" containerName="dnsmasq-dns" containerID="cri-o://371b262defdab34bf7aa2f4f1abe9721f0b0f3fca163eace16b6672e2f381e1c" gracePeriod=10
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.477960 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bxjmr" event={"ID":"ce9903d0-8cc8-4bce-99da-96d1e8657e2a","Type":"ContainerDied","Data":"5403d455ad5d05aa0ad56fa09ddbfb4dd90c509c95e79046919beb1855bf1701"}
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.478002 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5403d455ad5d05aa0ad56fa09ddbfb4dd90c509c95e79046919beb1855bf1701"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.478080 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bxjmr"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.501891 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ccqkc" event={"ID":"cfff4046-20be-4224-8bc7-0741b2fd01a7","Type":"ContainerDied","Data":"3ec47cb3dfb5232960a138ed3f4890eb0b9157b144cb77abb42244cfa3cb2282"}
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.501920 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ccqkc"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.501932 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ec47cb3dfb5232960a138ed3f4890eb0b9157b144cb77abb42244cfa3cb2282"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.525632 4849 generic.go:334] "Generic (PLEG): container finished" podID="e33ed079-a0fe-4167-98b1-25339aaf90d2" containerID="371b262defdab34bf7aa2f4f1abe9721f0b0f3fca163eace16b6672e2f381e1c" exitCode=0
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.526496 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" event={"ID":"e33ed079-a0fe-4167-98b1-25339aaf90d2","Type":"ContainerDied","Data":"371b262defdab34bf7aa2f4f1abe9721f0b0f3fca163eace16b6672e2f381e1c"}
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.575578 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:46:26 crc kubenswrapper[4849]: E0320 13:46:26.576161 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfff4046-20be-4224-8bc7-0741b2fd01a7" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.576174 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfff4046-20be-4224-8bc7-0741b2fd01a7" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:46:26 crc kubenswrapper[4849]: E0320 13:46:26.576192 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9903d0-8cc8-4bce-99da-96d1e8657e2a" containerName="nova-manage"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.576198 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9903d0-8cc8-4bce-99da-96d1e8657e2a" containerName="nova-manage"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.576791 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9903d0-8cc8-4bce-99da-96d1e8657e2a" containerName="nova-manage"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.576831 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfff4046-20be-4224-8bc7-0741b2fd01a7" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.577896 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.583178 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.591403 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.594516 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.663264 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.684640 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.684883 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-log" containerID="cri-o://3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a" gracePeriod=30
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.685042 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-api" containerID="cri-o://915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a" gracePeriod=30
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.699345 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.699654 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.732163 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.732415 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.732879 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj6nj\" (UniqueName: \"kubernetes.io/projected/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-kube-api-access-bj6nj\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.744763 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.766207 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.766410 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerName="nova-metadata-log" containerID="cri-o://433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e" gracePeriod=30
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.766573 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerName="nova-metadata-metadata" containerID="cri-o://2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583" gracePeriod=30
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.833952 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-nb\") pod \"e33ed079-a0fe-4167-98b1-25339aaf90d2\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.834298 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-config\") pod \"e33ed079-a0fe-4167-98b1-25339aaf90d2\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.834367 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-svc\") pod \"e33ed079-a0fe-4167-98b1-25339aaf90d2\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.834401 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-swift-storage-0\") pod \"e33ed079-a0fe-4167-98b1-25339aaf90d2\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.834422 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcbmw\" (UniqueName: \"kubernetes.io/projected/e33ed079-a0fe-4167-98b1-25339aaf90d2-kube-api-access-tcbmw\") pod \"e33ed079-a0fe-4167-98b1-25339aaf90d2\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.834448 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-sb\") pod \"e33ed079-a0fe-4167-98b1-25339aaf90d2\" (UID: \"e33ed079-a0fe-4167-98b1-25339aaf90d2\") "
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.834832 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.834913 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.835001 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj6nj\" (UniqueName: \"kubernetes.io/projected/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-kube-api-access-bj6nj\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.845313 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e33ed079-a0fe-4167-98b1-25339aaf90d2-kube-api-access-tcbmw" (OuterVolumeSpecName: "kube-api-access-tcbmw") pod "e33ed079-a0fe-4167-98b1-25339aaf90d2" (UID: "e33ed079-a0fe-4167-98b1-25339aaf90d2"). InnerVolumeSpecName "kube-api-access-tcbmw".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.845893 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.859447 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj6nj\" (UniqueName: \"kubernetes.io/projected/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-kube-api-access-bj6nj\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.865581 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.888628 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-config" (OuterVolumeSpecName: "config") pod "e33ed079-a0fe-4167-98b1-25339aaf90d2" (UID: "e33ed079-a0fe-4167-98b1-25339aaf90d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.901572 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e33ed079-a0fe-4167-98b1-25339aaf90d2" (UID: "e33ed079-a0fe-4167-98b1-25339aaf90d2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.901675 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e33ed079-a0fe-4167-98b1-25339aaf90d2" (UID: "e33ed079-a0fe-4167-98b1-25339aaf90d2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.911961 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e33ed079-a0fe-4167-98b1-25339aaf90d2" (UID: "e33ed079-a0fe-4167-98b1-25339aaf90d2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.916876 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e33ed079-a0fe-4167-98b1-25339aaf90d2" (UID: "e33ed079-a0fe-4167-98b1-25339aaf90d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.924842 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.937337 4849 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.937363 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcbmw\" (UniqueName: \"kubernetes.io/projected/e33ed079-a0fe-4167-98b1-25339aaf90d2-kube-api-access-tcbmw\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.937378 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.937387 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.937396 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:26 crc kubenswrapper[4849]: I0320 13:46:26.937404 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33ed079-a0fe-4167-98b1-25339aaf90d2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.381764 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.426621 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:46:27 crc kubenswrapper[4849]: W0320 13:46:27.428845 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c1b8d9e_0e94_4c17_8f58_3b9f95c54d75.slice/crio-ef4bd94e2402ef66cce88f68925475d02693a3586465542da9d734814453a7b1 WatchSource:0}: Error finding container ef4bd94e2402ef66cce88f68925475d02693a3586465542da9d734814453a7b1: Status 404 returned error can't find the container with id ef4bd94e2402ef66cce88f68925475d02693a3586465542da9d734814453a7b1 Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.538637 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" event={"ID":"e33ed079-a0fe-4167-98b1-25339aaf90d2","Type":"ContainerDied","Data":"5cebf946227dfa73efba59d090e2a97b33fca33984567fe85d71ef28f3e01b76"} Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.538688 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-l8pb4" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.538697 4849 scope.go:117] "RemoveContainer" containerID="371b262defdab34bf7aa2f4f1abe9721f0b0f3fca163eace16b6672e2f381e1c" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.544430 4849 generic.go:334] "Generic (PLEG): container finished" podID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerID="2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583" exitCode=0 Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.544475 4849 generic.go:334] "Generic (PLEG): container finished" podID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerID="433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e" exitCode=143 Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.544561 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a27a2635-ce6e-4c86-bd1b-310fe76fba4f","Type":"ContainerDied","Data":"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583"} Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.544600 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a27a2635-ce6e-4c86-bd1b-310fe76fba4f","Type":"ContainerDied","Data":"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e"} Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.544611 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a27a2635-ce6e-4c86-bd1b-310fe76fba4f","Type":"ContainerDied","Data":"e6af235109decc5eb71e24473f2ec1a145275682c2d9860069ec491d455d5338"} Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.544700 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.550977 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-combined-ca-bundle\") pod \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.551100 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmv6x\" (UniqueName: \"kubernetes.io/projected/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-kube-api-access-zmv6x\") pod \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.551195 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-nova-metadata-tls-certs\") pod \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.551251 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-config-data\") pod \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.551347 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-logs\") pod \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\" (UID: \"a27a2635-ce6e-4c86-bd1b-310fe76fba4f\") " Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.552508 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-logs" (OuterVolumeSpecName: "logs") pod "a27a2635-ce6e-4c86-bd1b-310fe76fba4f" (UID: "a27a2635-ce6e-4c86-bd1b-310fe76fba4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.556133 4849 generic.go:334] "Generic (PLEG): container finished" podID="e2474fc3-733d-4664-9282-bda5d96a817d" containerID="3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a" exitCode=143 Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.556250 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2474fc3-733d-4664-9282-bda5d96a817d","Type":"ContainerDied","Data":"3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a"} Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.556775 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-kube-api-access-zmv6x" (OuterVolumeSpecName: "kube-api-access-zmv6x") pod "a27a2635-ce6e-4c86-bd1b-310fe76fba4f" (UID: "a27a2635-ce6e-4c86-bd1b-310fe76fba4f"). InnerVolumeSpecName "kube-api-access-zmv6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.580680 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75","Type":"ContainerStarted","Data":"ef4bd94e2402ef66cce88f68925475d02693a3586465542da9d734814453a7b1"} Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.591537 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-l8pb4"] Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.595425 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a27a2635-ce6e-4c86-bd1b-310fe76fba4f" (UID: "a27a2635-ce6e-4c86-bd1b-310fe76fba4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.597239 4849 scope.go:117] "RemoveContainer" containerID="246ee0bf4da946bf523f8fcf85b311a41ea0836b58f6c64d7b1a24e7b689e3d1" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.597923 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-config-data" (OuterVolumeSpecName: "config-data") pod "a27a2635-ce6e-4c86-bd1b-310fe76fba4f" (UID: "a27a2635-ce6e-4c86-bd1b-310fe76fba4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.599564 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-l8pb4"] Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.634565 4849 scope.go:117] "RemoveContainer" containerID="2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.653851 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.653888 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.653897 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.653929 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmv6x\" (UniqueName: \"kubernetes.io/projected/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-kube-api-access-zmv6x\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.664048 4849 scope.go:117] "RemoveContainer" containerID="433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.690934 4849 scope.go:117] "RemoveContainer" containerID="2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583" Mar 20 13:46:27 crc kubenswrapper[4849]: E0320 13:46:27.693453 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583\": container with ID starting with 2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583 not found: ID does not exist" containerID="2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.693493 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583"} err="failed to get container status \"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583\": rpc error: code = NotFound desc = could not find container \"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583\": container with ID starting with 2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583 not found: ID does not exist" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.693521 4849 scope.go:117] "RemoveContainer" containerID="433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.698151 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a27a2635-ce6e-4c86-bd1b-310fe76fba4f" (UID: "a27a2635-ce6e-4c86-bd1b-310fe76fba4f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:27 crc kubenswrapper[4849]: E0320 13:46:27.699258 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e\": container with ID starting with 433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e not found: ID does not exist" containerID="433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.699291 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e"} err="failed to get container status \"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e\": rpc error: code = NotFound desc = could not find container \"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e\": container with ID starting with 433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e not found: ID does not exist" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.699320 4849 scope.go:117] "RemoveContainer" containerID="2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.699615 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583"} err="failed to get container status \"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583\": rpc error: code = NotFound desc = could not find container \"2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583\": container with ID starting with 2dd198d1d02f038e28d5f33f845e2c7e2123a6b6205b0222feaf77674ddc0583 not found: ID does not exist" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.699647 4849 scope.go:117] "RemoveContainer" 
containerID="433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.699842 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e"} err="failed to get container status \"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e\": rpc error: code = NotFound desc = could not find container \"433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e\": container with ID starting with 433bfc0a681694c74ecd1577cfbdfa40d76bcc2189d03bb85cb5441a622a374e not found: ID does not exist" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.755641 4849 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27a2635-ce6e-4c86-bd1b-310fe76fba4f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.891018 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.904205 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.914458 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:27 crc kubenswrapper[4849]: E0320 13:46:27.914902 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33ed079-a0fe-4167-98b1-25339aaf90d2" containerName="init" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.914919 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33ed079-a0fe-4167-98b1-25339aaf90d2" containerName="init" Mar 20 13:46:27 crc kubenswrapper[4849]: E0320 13:46:27.914939 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerName="nova-metadata-metadata" Mar 20 
13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.914945 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerName="nova-metadata-metadata" Mar 20 13:46:27 crc kubenswrapper[4849]: E0320 13:46:27.914958 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33ed079-a0fe-4167-98b1-25339aaf90d2" containerName="dnsmasq-dns" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.914963 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33ed079-a0fe-4167-98b1-25339aaf90d2" containerName="dnsmasq-dns" Mar 20 13:46:27 crc kubenswrapper[4849]: E0320 13:46:27.914985 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerName="nova-metadata-log" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.914990 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerName="nova-metadata-log" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.915202 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerName="nova-metadata-log" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.915213 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e33ed079-a0fe-4167-98b1-25339aaf90d2" containerName="dnsmasq-dns" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.915224 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" containerName="nova-metadata-metadata" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.916250 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.918490 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.919140 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:46:27 crc kubenswrapper[4849]: I0320 13:46:27.923054 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.063129 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.063244 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-config-data\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.063417 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw5bm\" (UniqueName: \"kubernetes.io/projected/07a19c91-f95f-456b-a8dd-52743845e141-kube-api-access-gw5bm\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.063451 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.063475 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07a19c91-f95f-456b-a8dd-52743845e141-logs\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.165280 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.165351 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-config-data\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.165455 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw5bm\" (UniqueName: \"kubernetes.io/projected/07a19c91-f95f-456b-a8dd-52743845e141-kube-api-access-gw5bm\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.165481 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " 
pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.165503 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07a19c91-f95f-456b-a8dd-52743845e141-logs\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.166378 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07a19c91-f95f-456b-a8dd-52743845e141-logs\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.174302 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.179667 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-config-data\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.180293 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.183922 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw5bm\" (UniqueName: 
\"kubernetes.io/projected/07a19c91-f95f-456b-a8dd-52743845e141-kube-api-access-gw5bm\") pod \"nova-metadata-0\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.234200 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.589694 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75","Type":"ContainerStarted","Data":"70ad9e920e27367ce7a6bd45ad3472040562867bf30174b46560eb9b8d9a25f8"} Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.589769 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.593194 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fcb1dd05-5332-4f88-8d33-b456d59d00d0" containerName="nova-scheduler-scheduler" containerID="cri-o://cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31" gracePeriod=30 Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.617308 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.6172886159999997 podStartE2EDuration="2.617288616s" podCreationTimestamp="2026-03-20 13:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:28.605108148 +0000 UTC m=+1338.282831543" watchObservedRunningTime="2026-03-20 13:46:28.617288616 +0000 UTC m=+1338.295012011" Mar 20 13:46:28 crc kubenswrapper[4849]: I0320 13:46:28.721272 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:46:29 crc kubenswrapper[4849]: I0320 13:46:29.047238 4849 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27a2635-ce6e-4c86-bd1b-310fe76fba4f" path="/var/lib/kubelet/pods/a27a2635-ce6e-4c86-bd1b-310fe76fba4f/volumes" Mar 20 13:46:29 crc kubenswrapper[4849]: I0320 13:46:29.047916 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e33ed079-a0fe-4167-98b1-25339aaf90d2" path="/var/lib/kubelet/pods/e33ed079-a0fe-4167-98b1-25339aaf90d2/volumes" Mar 20 13:46:29 crc kubenswrapper[4849]: I0320 13:46:29.605017 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07a19c91-f95f-456b-a8dd-52743845e141","Type":"ContainerStarted","Data":"b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d"} Mar 20 13:46:29 crc kubenswrapper[4849]: I0320 13:46:29.605538 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07a19c91-f95f-456b-a8dd-52743845e141","Type":"ContainerStarted","Data":"05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb"} Mar 20 13:46:29 crc kubenswrapper[4849]: I0320 13:46:29.605556 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07a19c91-f95f-456b-a8dd-52743845e141","Type":"ContainerStarted","Data":"db59972bb294284a09fd804b2a49006d8d9a8550d56eb9a0218d0874434d330c"} Mar 20 13:46:29 crc kubenswrapper[4849]: I0320 13:46:29.630522 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.630504934 podStartE2EDuration="2.630504934s" podCreationTimestamp="2026-03-20 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:29.628471811 +0000 UTC m=+1339.306195206" watchObservedRunningTime="2026-03-20 13:46:29.630504934 +0000 UTC m=+1339.308228329" Mar 20 13:46:30 crc kubenswrapper[4849]: E0320 13:46:30.723613 4849 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:46:30 crc kubenswrapper[4849]: E0320 13:46:30.725898 4849 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:46:30 crc kubenswrapper[4849]: E0320 13:46:30.728730 4849 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:46:30 crc kubenswrapper[4849]: E0320 13:46:30.728891 4849 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fcb1dd05-5332-4f88-8d33-b456d59d00d0" containerName="nova-scheduler-scheduler" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.518838 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.569417 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.627587 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whfmb\" (UniqueName: \"kubernetes.io/projected/fcb1dd05-5332-4f88-8d33-b456d59d00d0-kube-api-access-whfmb\") pod \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.627658 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-combined-ca-bundle\") pod \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.627707 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-config-data\") pod \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\" (UID: \"fcb1dd05-5332-4f88-8d33-b456d59d00d0\") " Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.638190 4849 generic.go:334] "Generic (PLEG): container finished" podID="fcb1dd05-5332-4f88-8d33-b456d59d00d0" containerID="cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31" exitCode=0 Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.638205 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb1dd05-5332-4f88-8d33-b456d59d00d0-kube-api-access-whfmb" (OuterVolumeSpecName: "kube-api-access-whfmb") pod "fcb1dd05-5332-4f88-8d33-b456d59d00d0" (UID: "fcb1dd05-5332-4f88-8d33-b456d59d00d0"). InnerVolumeSpecName "kube-api-access-whfmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.638234 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcb1dd05-5332-4f88-8d33-b456d59d00d0","Type":"ContainerDied","Data":"cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31"} Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.638259 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fcb1dd05-5332-4f88-8d33-b456d59d00d0","Type":"ContainerDied","Data":"05b2b4a97c34ef32ad4162d184b7a6e89f5c4ba523dfe0c60fb83fb899b092e6"} Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.638278 4849 scope.go:117] "RemoveContainer" containerID="cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.638288 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.656580 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-config-data" (OuterVolumeSpecName: "config-data") pod "fcb1dd05-5332-4f88-8d33-b456d59d00d0" (UID: "fcb1dd05-5332-4f88-8d33-b456d59d00d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.675463 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcb1dd05-5332-4f88-8d33-b456d59d00d0" (UID: "fcb1dd05-5332-4f88-8d33-b456d59d00d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.729322 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.729351 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whfmb\" (UniqueName: \"kubernetes.io/projected/fcb1dd05-5332-4f88-8d33-b456d59d00d0-kube-api-access-whfmb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.729364 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb1dd05-5332-4f88-8d33-b456d59d00d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.741749 4849 scope.go:117] "RemoveContainer" containerID="cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31" Mar 20 13:46:31 crc kubenswrapper[4849]: E0320 13:46:31.742088 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31\": container with ID starting with cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31 not found: ID does not exist" containerID="cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.742125 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31"} err="failed to get container status \"cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31\": rpc error: code = NotFound desc = could not find container \"cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31\": container with ID 
starting with cb080c64d34627f6044dbb31c0bb72b414635378e067aa1804c2b0a35e6a0b31 not found: ID does not exist" Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.982470 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:46:31 crc kubenswrapper[4849]: I0320 13:46:31.995708 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.008509 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:46:32 crc kubenswrapper[4849]: E0320 13:46:32.009348 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb1dd05-5332-4f88-8d33-b456d59d00d0" containerName="nova-scheduler-scheduler" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.009463 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb1dd05-5332-4f88-8d33-b456d59d00d0" containerName="nova-scheduler-scheduler" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.009855 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb1dd05-5332-4f88-8d33-b456d59d00d0" containerName="nova-scheduler-scheduler" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.010960 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.013103 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.024363 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.135858 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.135923 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbz6j\" (UniqueName: \"kubernetes.io/projected/80d41c24-8a33-4183-9397-f46556219054-kube-api-access-xbz6j\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.135968 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-config-data\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.238418 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.238501 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbz6j\" (UniqueName: \"kubernetes.io/projected/80d41c24-8a33-4183-9397-f46556219054-kube-api-access-xbz6j\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.238550 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-config-data\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.242234 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-config-data\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.246662 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.253706 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbz6j\" (UniqueName: \"kubernetes.io/projected/80d41c24-8a33-4183-9397-f46556219054-kube-api-access-xbz6j\") pod \"nova-scheduler-0\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.343571 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.472107 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.542481 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2474fc3-733d-4664-9282-bda5d96a817d-logs\") pod \"e2474fc3-733d-4664-9282-bda5d96a817d\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.542617 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57n75\" (UniqueName: \"kubernetes.io/projected/e2474fc3-733d-4664-9282-bda5d96a817d-kube-api-access-57n75\") pod \"e2474fc3-733d-4664-9282-bda5d96a817d\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.542666 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-combined-ca-bundle\") pod \"e2474fc3-733d-4664-9282-bda5d96a817d\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.542721 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-config-data\") pod \"e2474fc3-733d-4664-9282-bda5d96a817d\" (UID: \"e2474fc3-733d-4664-9282-bda5d96a817d\") " Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.543337 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2474fc3-733d-4664-9282-bda5d96a817d-logs" (OuterVolumeSpecName: "logs") pod "e2474fc3-733d-4664-9282-bda5d96a817d" (UID: "e2474fc3-733d-4664-9282-bda5d96a817d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.543790 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2474fc3-733d-4664-9282-bda5d96a817d-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.550909 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2474fc3-733d-4664-9282-bda5d96a817d-kube-api-access-57n75" (OuterVolumeSpecName: "kube-api-access-57n75") pod "e2474fc3-733d-4664-9282-bda5d96a817d" (UID: "e2474fc3-733d-4664-9282-bda5d96a817d"). InnerVolumeSpecName "kube-api-access-57n75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.578107 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-config-data" (OuterVolumeSpecName: "config-data") pod "e2474fc3-733d-4664-9282-bda5d96a817d" (UID: "e2474fc3-733d-4664-9282-bda5d96a817d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.581640 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2474fc3-733d-4664-9282-bda5d96a817d" (UID: "e2474fc3-733d-4664-9282-bda5d96a817d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.655655 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57n75\" (UniqueName: \"kubernetes.io/projected/e2474fc3-733d-4664-9282-bda5d96a817d-kube-api-access-57n75\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.655712 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.655758 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2474fc3-733d-4664-9282-bda5d96a817d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.668325 4849 generic.go:334] "Generic (PLEG): container finished" podID="e2474fc3-733d-4664-9282-bda5d96a817d" containerID="915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a" exitCode=0 Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.668510 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.668425 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2474fc3-733d-4664-9282-bda5d96a817d","Type":"ContainerDied","Data":"915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a"} Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.669070 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2474fc3-733d-4664-9282-bda5d96a817d","Type":"ContainerDied","Data":"b9e75e3b543052eb83935351009479896568b039712f4c206de88c9730f98d48"} Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.669127 4849 scope.go:117] "RemoveContainer" containerID="915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.697747 4849 scope.go:117] "RemoveContainer" containerID="3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.705929 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.709845 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.722909 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:32 crc kubenswrapper[4849]: E0320 13:46:32.723359 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-log" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.723380 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-log" Mar 20 13:46:32 crc kubenswrapper[4849]: E0320 13:46:32.723408 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" 
containerName="nova-api-api" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.723416 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-api" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.723639 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-log" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.723668 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" containerName="nova-api-api" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.724943 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.727151 4849 scope.go:117] "RemoveContainer" containerID="915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a" Mar 20 13:46:32 crc kubenswrapper[4849]: E0320 13:46:32.727663 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a\": container with ID starting with 915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a not found: ID does not exist" containerID="915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.727696 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a"} err="failed to get container status \"915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a\": rpc error: code = NotFound desc = could not find container \"915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a\": container with ID starting with 915338cf1bda9eb0fc5fdeb629061aa104ed3ef815145771837977ba150c6a4a not 
found: ID does not exist" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.727720 4849 scope.go:117] "RemoveContainer" containerID="3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.727887 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:46:32 crc kubenswrapper[4849]: E0320 13:46:32.737368 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a\": container with ID starting with 3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a not found: ID does not exist" containerID="3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.737410 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a"} err="failed to get container status \"3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a\": rpc error: code = NotFound desc = could not find container \"3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a\": container with ID starting with 3535fd17939d90dac2e6c9b03d96a557cfe2bc47fcc4e67ad48901f68058285a not found: ID does not exist" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.739469 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.815285 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:46:32 crc kubenswrapper[4849]: W0320 13:46:32.819681 4849 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80d41c24_8a33_4183_9397_f46556219054.slice/crio-43f7524f23decfa3241edfa115417c39f277e21eaeae7d1836bc982dd68818fc WatchSource:0}: Error finding container 43f7524f23decfa3241edfa115417c39f277e21eaeae7d1836bc982dd68818fc: Status 404 returned error can't find the container with id 43f7524f23decfa3241edfa115417c39f277e21eaeae7d1836bc982dd68818fc Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.861465 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-config-data\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.861536 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f300511-17b3-4361-8694-3ba7b2d125ed-logs\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.861712 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kttt4\" (UniqueName: \"kubernetes.io/projected/8f300511-17b3-4361-8694-3ba7b2d125ed-kube-api-access-kttt4\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.861868 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.964158 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.964224 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-config-data\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.964251 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f300511-17b3-4361-8694-3ba7b2d125ed-logs\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.964334 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kttt4\" (UniqueName: \"kubernetes.io/projected/8f300511-17b3-4361-8694-3ba7b2d125ed-kube-api-access-kttt4\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.965019 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f300511-17b3-4361-8694-3ba7b2d125ed-logs\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.968848 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-config-data\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc 
kubenswrapper[4849]: I0320 13:46:32.969021 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:32 crc kubenswrapper[4849]: I0320 13:46:32.981504 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kttt4\" (UniqueName: \"kubernetes.io/projected/8f300511-17b3-4361-8694-3ba7b2d125ed-kube-api-access-kttt4\") pod \"nova-api-0\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " pod="openstack/nova-api-0" Mar 20 13:46:33 crc kubenswrapper[4849]: I0320 13:46:33.048640 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2474fc3-733d-4664-9282-bda5d96a817d" path="/var/lib/kubelet/pods/e2474fc3-733d-4664-9282-bda5d96a817d/volumes" Mar 20 13:46:33 crc kubenswrapper[4849]: I0320 13:46:33.048734 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:46:33 crc kubenswrapper[4849]: I0320 13:46:33.049648 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb1dd05-5332-4f88-8d33-b456d59d00d0" path="/var/lib/kubelet/pods/fcb1dd05-5332-4f88-8d33-b456d59d00d0/volumes" Mar 20 13:46:33 crc kubenswrapper[4849]: I0320 13:46:33.502201 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:33 crc kubenswrapper[4849]: I0320 13:46:33.683576 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d41c24-8a33-4183-9397-f46556219054","Type":"ContainerStarted","Data":"9f330ca93994e4ddab6a0b638127fa00003b61328412a4deeb92d34683743289"} Mar 20 13:46:33 crc kubenswrapper[4849]: I0320 13:46:33.684011 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d41c24-8a33-4183-9397-f46556219054","Type":"ContainerStarted","Data":"43f7524f23decfa3241edfa115417c39f277e21eaeae7d1836bc982dd68818fc"} Mar 20 13:46:33 crc kubenswrapper[4849]: I0320 13:46:33.688032 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8f300511-17b3-4361-8694-3ba7b2d125ed","Type":"ContainerStarted","Data":"f5fb8de51205d7246358d1298c6fad2c929a7f7676cf69443fa47f2a5ee0f3f9"} Mar 20 13:46:33 crc kubenswrapper[4849]: I0320 13:46:33.704898 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.704875978 podStartE2EDuration="2.704875978s" podCreationTimestamp="2026-03-20 13:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:33.698153243 +0000 UTC m=+1343.375876648" watchObservedRunningTime="2026-03-20 13:46:33.704875978 +0000 UTC m=+1343.382599393" Mar 20 13:46:34 crc kubenswrapper[4849]: I0320 13:46:34.722175 4849 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8f300511-17b3-4361-8694-3ba7b2d125ed","Type":"ContainerStarted","Data":"c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787"} Mar 20 13:46:34 crc kubenswrapper[4849]: I0320 13:46:34.722233 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8f300511-17b3-4361-8694-3ba7b2d125ed","Type":"ContainerStarted","Data":"a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b"} Mar 20 13:46:34 crc kubenswrapper[4849]: I0320 13:46:34.754692 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.754673809 podStartE2EDuration="2.754673809s" podCreationTimestamp="2026-03-20 13:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:34.754238397 +0000 UTC m=+1344.431961792" watchObservedRunningTime="2026-03-20 13:46:34.754673809 +0000 UTC m=+1344.432397204" Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.330406 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.330891 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e53df741-614d-449c-8da6-4de0333a6e9b" containerName="kube-state-metrics" containerID="cri-o://41a65b89ba16e1797ddad54421f40dcfa1aa9bcc02eb9ea68e86610033152b85" gracePeriod=30 Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.735046 4849 generic.go:334] "Generic (PLEG): container finished" podID="e53df741-614d-449c-8da6-4de0333a6e9b" containerID="41a65b89ba16e1797ddad54421f40dcfa1aa9bcc02eb9ea68e86610033152b85" exitCode=2 Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.735156 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"e53df741-614d-449c-8da6-4de0333a6e9b","Type":"ContainerDied","Data":"41a65b89ba16e1797ddad54421f40dcfa1aa9bcc02eb9ea68e86610033152b85"} Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.735594 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e53df741-614d-449c-8da6-4de0333a6e9b","Type":"ContainerDied","Data":"4db35a6293e0d0e35cffdf5e0ecf13489fb7744802dfa482dda428a1981b447c"} Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.735615 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db35a6293e0d0e35cffdf5e0ecf13489fb7744802dfa482dda428a1981b447c" Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.815868 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.922889 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqpqr\" (UniqueName: \"kubernetes.io/projected/e53df741-614d-449c-8da6-4de0333a6e9b-kube-api-access-hqpqr\") pod \"e53df741-614d-449c-8da6-4de0333a6e9b\" (UID: \"e53df741-614d-449c-8da6-4de0333a6e9b\") " Mar 20 13:46:35 crc kubenswrapper[4849]: I0320 13:46:35.930629 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53df741-614d-449c-8da6-4de0333a6e9b-kube-api-access-hqpqr" (OuterVolumeSpecName: "kube-api-access-hqpqr") pod "e53df741-614d-449c-8da6-4de0333a6e9b" (UID: "e53df741-614d-449c-8da6-4de0333a6e9b"). InnerVolumeSpecName "kube-api-access-hqpqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.024641 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqpqr\" (UniqueName: \"kubernetes.io/projected/e53df741-614d-449c-8da6-4de0333a6e9b-kube-api-access-hqpqr\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.743036 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.783633 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.797602 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.807205 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:36 crc kubenswrapper[4849]: E0320 13:46:36.807759 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53df741-614d-449c-8da6-4de0333a6e9b" containerName="kube-state-metrics" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.807787 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53df741-614d-449c-8da6-4de0333a6e9b" containerName="kube-state-metrics" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.808152 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e53df741-614d-449c-8da6-4de0333a6e9b" containerName="kube-state-metrics" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.809214 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.816846 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.822651 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.822671 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.940113 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.940489 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="ceilometer-central-agent" containerID="cri-o://a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.941139 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="proxy-httpd" containerID="cri-o://12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.941204 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="sg-core" containerID="cri-o://abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.941253 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" 
containerName="ceilometer-notification-agent" containerID="cri-o://b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.945235 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.945343 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.945377 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tprs2\" (UniqueName: \"kubernetes.io/projected/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-api-access-tprs2\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.945437 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:36 crc kubenswrapper[4849]: I0320 13:46:36.954751 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 13:46:37 crc 
kubenswrapper[4849]: I0320 13:46:37.047534 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e53df741-614d-449c-8da6-4de0333a6e9b" path="/var/lib/kubelet/pods/e53df741-614d-449c-8da6-4de0333a6e9b/volumes" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.047959 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.049165 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.049260 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tprs2\" (UniqueName: \"kubernetes.io/projected/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-api-access-tprs2\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.049330 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.052688 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.053462 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.054521 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.073621 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tprs2\" (UniqueName: \"kubernetes.io/projected/a36f6ced-ab2b-44a3-b7d6-c4744d7f959d-kube-api-access-tprs2\") pod \"kube-state-metrics-0\" (UID: \"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.171649 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.344728 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.664176 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.752134 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d","Type":"ContainerStarted","Data":"b57eed07a91426f6d5b36c99e299a6ebf496ebf1d8c5b402bd7371b4e3d2e8bb"} Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.756608 4849 generic.go:334] "Generic (PLEG): container finished" podID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerID="12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1" exitCode=0 Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.756643 4849 generic.go:334] "Generic (PLEG): container finished" podID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerID="abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c" exitCode=2 Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.756655 4849 generic.go:334] "Generic (PLEG): container finished" podID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerID="a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e" exitCode=0 Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.756665 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerDied","Data":"12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1"} Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.756689 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerDied","Data":"abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c"} Mar 20 13:46:37 crc kubenswrapper[4849]: I0320 13:46:37.756700 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerDied","Data":"a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e"} Mar 20 13:46:38 crc kubenswrapper[4849]: I0320 13:46:38.235226 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:46:38 crc kubenswrapper[4849]: I0320 13:46:38.235274 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:46:38 crc kubenswrapper[4849]: I0320 13:46:38.766728 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a36f6ced-ab2b-44a3-b7d6-c4744d7f959d","Type":"ContainerStarted","Data":"3ef586bf499bb1fa44e8e5d135fb1f489382ac9d8f91e11934c6063903d72cf8"} Mar 20 13:46:38 crc kubenswrapper[4849]: I0320 13:46:38.767085 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:46:38 crc kubenswrapper[4849]: I0320 13:46:38.789227 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.37835424 podStartE2EDuration="2.789204535s" podCreationTimestamp="2026-03-20 13:46:36 +0000 UTC" firstStartedPulling="2026-03-20 13:46:37.677437548 +0000 UTC m=+1347.355160943" lastFinishedPulling="2026-03-20 13:46:38.088287843 +0000 UTC m=+1347.766011238" observedRunningTime="2026-03-20 13:46:38.780273682 +0000 UTC m=+1348.457997067" watchObservedRunningTime="2026-03-20 13:46:38.789204535 +0000 UTC m=+1348.466927930" Mar 20 13:46:39 crc kubenswrapper[4849]: I0320 13:46:39.247949 4849 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:46:39 crc kubenswrapper[4849]: I0320 13:46:39.247969 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:46:39 crc kubenswrapper[4849]: I0320 13:46:39.384098 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:46:39 crc kubenswrapper[4849]: I0320 13:46:39.384176 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.650947 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.724130 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-scripts\") pod \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.724176 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-sg-core-conf-yaml\") pod \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.724314 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-run-httpd\") pod \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.724340 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-config-data\") pod \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.724409 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh8zn\" (UniqueName: \"kubernetes.io/projected/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-kube-api-access-fh8zn\") pod \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.724458 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-log-httpd\") pod \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.724501 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-combined-ca-bundle\") pod \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\" (UID: \"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0\") " Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.725356 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" (UID: "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.725401 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" (UID: "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.725902 4849 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.725931 4849 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.732965 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-kube-api-access-fh8zn" (OuterVolumeSpecName: "kube-api-access-fh8zn") pod "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" (UID: "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0"). InnerVolumeSpecName "kube-api-access-fh8zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.733047 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-scripts" (OuterVolumeSpecName: "scripts") pod "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" (UID: "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.770556 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" (UID: "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.804950 4849 generic.go:334] "Generic (PLEG): container finished" podID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerID="b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9" exitCode=0 Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.804993 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerDied","Data":"b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9"} Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.805024 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0","Type":"ContainerDied","Data":"eb02db2a92e824f07f6d82de8c16025882cdeab2a43342ee6de18c1337de2c05"} Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.805042 4849 scope.go:117] "RemoveContainer" containerID="12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.805396 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.827748 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.827777 4849 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.827788 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh8zn\" (UniqueName: \"kubernetes.io/projected/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-kube-api-access-fh8zn\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.836338 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-config-data" (OuterVolumeSpecName: "config-data") pod "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" (UID: "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.839791 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" (UID: "ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.840256 4849 scope.go:117] "RemoveContainer" containerID="abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.865718 4849 scope.go:117] "RemoveContainer" containerID="b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.905879 4849 scope.go:117] "RemoveContainer" containerID="a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.930049 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.930078 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.931156 4849 scope.go:117] "RemoveContainer" containerID="12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1" Mar 20 13:46:40 crc kubenswrapper[4849]: E0320 13:46:40.931644 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1\": container with ID starting with 12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1 not found: ID does not exist" containerID="12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.931673 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1"} 
err="failed to get container status \"12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1\": rpc error: code = NotFound desc = could not find container \"12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1\": container with ID starting with 12d9dd58c863717b62ce2256dce51d9ae93eb35489ef4ab4ae8a2c0a0ffa8ee1 not found: ID does not exist" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.931692 4849 scope.go:117] "RemoveContainer" containerID="abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c" Mar 20 13:46:40 crc kubenswrapper[4849]: E0320 13:46:40.932089 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c\": container with ID starting with abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c not found: ID does not exist" containerID="abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.932126 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c"} err="failed to get container status \"abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c\": rpc error: code = NotFound desc = could not find container \"abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c\": container with ID starting with abb9787c4307320fb3a6b71b0d08d0f7cb7f7d9c06ba98de6f0f2887d689b99c not found: ID does not exist" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.932152 4849 scope.go:117] "RemoveContainer" containerID="b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9" Mar 20 13:46:40 crc kubenswrapper[4849]: E0320 13:46:40.932387 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9\": container with ID starting with b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9 not found: ID does not exist" containerID="b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.932418 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9"} err="failed to get container status \"b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9\": rpc error: code = NotFound desc = could not find container \"b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9\": container with ID starting with b1757d17a544a9038f05588924db1cbe9297c0627c23530d3bf8d896311b83b9 not found: ID does not exist" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.932433 4849 scope.go:117] "RemoveContainer" containerID="a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e" Mar 20 13:46:40 crc kubenswrapper[4849]: E0320 13:46:40.932783 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e\": container with ID starting with a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e not found: ID does not exist" containerID="a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e" Mar 20 13:46:40 crc kubenswrapper[4849]: I0320 13:46:40.932804 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e"} err="failed to get container status \"a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e\": rpc error: code = NotFound desc = could not find container \"a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e\": container with ID 
starting with a91ec885780886bc0be39ffa8248fcff2f3fd3dc46fb40752bc118fe1da0781e not found: ID does not exist" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.129241 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.139986 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.164055 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:41 crc kubenswrapper[4849]: E0320 13:46:41.164443 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="ceilometer-central-agent" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.164456 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="ceilometer-central-agent" Mar 20 13:46:41 crc kubenswrapper[4849]: E0320 13:46:41.164471 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="proxy-httpd" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.164479 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="proxy-httpd" Mar 20 13:46:41 crc kubenswrapper[4849]: E0320 13:46:41.164495 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="sg-core" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.164503 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="sg-core" Mar 20 13:46:41 crc kubenswrapper[4849]: E0320 13:46:41.164534 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="ceilometer-notification-agent" Mar 20 13:46:41 crc kubenswrapper[4849]: 
I0320 13:46:41.164540 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="ceilometer-notification-agent" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.164714 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="proxy-httpd" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.164742 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="sg-core" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.164754 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="ceilometer-central-agent" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.164764 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" containerName="ceilometer-notification-agent" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.166341 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.168420 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.168687 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.168691 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.186433 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.339551 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-scripts\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.339632 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.339798 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-log-httpd\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.339902 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-config-data\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.339953 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-run-httpd\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.340105 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.340162 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szcg7\" (UniqueName: \"kubernetes.io/projected/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-kube-api-access-szcg7\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.340237 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.442109 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.442206 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-log-httpd\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.442249 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-config-data\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.442276 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-run-httpd\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.442324 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.442356 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szcg7\" (UniqueName: \"kubernetes.io/projected/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-kube-api-access-szcg7\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.442385 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.442460 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-scripts\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.443756 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-run-httpd\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.443935 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-log-httpd\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.446682 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.446876 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 
13:46:41.447079 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.446702 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-scripts\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.447840 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-config-data\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.461085 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szcg7\" (UniqueName: \"kubernetes.io/projected/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-kube-api-access-szcg7\") pod \"ceilometer-0\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.478159 4849 scope.go:117] "RemoveContainer" containerID="b04e87520cc6d45e6d96f9d5d18f2768dbc9367da2e4fa7e0850e700f0134cda" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.489356 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:41 crc kubenswrapper[4849]: I0320 13:46:41.963495 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:41 crc kubenswrapper[4849]: W0320 13:46:41.969587 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee472bc4_f40a_41d9_b1e7_651f9c39eb27.slice/crio-95c8147305f4025fbc302cbe021c637991874ac19ec8a7038df1b18515dceff9 WatchSource:0}: Error finding container 95c8147305f4025fbc302cbe021c637991874ac19ec8a7038df1b18515dceff9: Status 404 returned error can't find the container with id 95c8147305f4025fbc302cbe021c637991874ac19ec8a7038df1b18515dceff9 Mar 20 13:46:42 crc kubenswrapper[4849]: I0320 13:46:42.343983 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:46:42 crc kubenswrapper[4849]: I0320 13:46:42.379306 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:46:42 crc kubenswrapper[4849]: I0320 13:46:42.824155 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerStarted","Data":"95c8147305f4025fbc302cbe021c637991874ac19ec8a7038df1b18515dceff9"} Mar 20 13:46:42 crc kubenswrapper[4849]: I0320 13:46:42.848660 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:46:43 crc kubenswrapper[4849]: I0320 13:46:43.045687 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0" path="/var/lib/kubelet/pods/ceb2ed80-2d5c-41ec-9359-17e85ed9e5f0/volumes" Mar 20 13:46:43 crc kubenswrapper[4849]: I0320 13:46:43.049417 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:46:43 crc 
kubenswrapper[4849]: I0320 13:46:43.049723 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:46:43 crc kubenswrapper[4849]: I0320 13:46:43.836960 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerStarted","Data":"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56"} Mar 20 13:46:44 crc kubenswrapper[4849]: I0320 13:46:44.139925 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:46:44 crc kubenswrapper[4849]: I0320 13:46:44.140702 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:46:44 crc kubenswrapper[4849]: I0320 13:46:44.849270 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerStarted","Data":"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b"} Mar 20 13:46:45 crc kubenswrapper[4849]: I0320 13:46:45.860061 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerStarted","Data":"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42"} Mar 20 13:46:46 crc kubenswrapper[4849]: I0320 13:46:46.234864 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:46:46 crc kubenswrapper[4849]: I0320 13:46:46.234918 4849 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:46:47 crc kubenswrapper[4849]: I0320 13:46:47.187040 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:46:47 crc kubenswrapper[4849]: I0320 13:46:47.880548 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerStarted","Data":"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178"} Mar 20 13:46:47 crc kubenswrapper[4849]: I0320 13:46:47.881861 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:46:48 crc kubenswrapper[4849]: I0320 13:46:48.238955 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:46:48 crc kubenswrapper[4849]: I0320 13:46:48.242549 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:46:48 crc kubenswrapper[4849]: I0320 13:46:48.244128 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:46:48 crc kubenswrapper[4849]: I0320 13:46:48.268572 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.539263963 podStartE2EDuration="7.268553291s" podCreationTimestamp="2026-03-20 13:46:41 +0000 UTC" firstStartedPulling="2026-03-20 13:46:41.972774849 +0000 UTC m=+1351.650498244" lastFinishedPulling="2026-03-20 13:46:46.702064177 +0000 UTC m=+1356.379787572" observedRunningTime="2026-03-20 13:46:47.906881777 +0000 UTC m=+1357.584605172" watchObservedRunningTime="2026-03-20 13:46:48.268553291 +0000 UTC m=+1357.946276686" Mar 20 13:46:48 crc kubenswrapper[4849]: I0320 13:46:48.898935 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.792679 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.907004 4849 generic.go:334] "Generic (PLEG): container finished" podID="cfb1017b-3277-4667-aade-d2852d0ddd0e" containerID="9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8" exitCode=137 Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.907973 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.908445 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfb1017b-3277-4667-aade-d2852d0ddd0e","Type":"ContainerDied","Data":"9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8"} Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.908475 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfb1017b-3277-4667-aade-d2852d0ddd0e","Type":"ContainerDied","Data":"dc928e04393f71357e364735b2de3e3d45a91464260d00da9e7ed16df920a836"} Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.908493 4849 scope.go:117] "RemoveContainer" containerID="9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8" Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.930307 4849 scope.go:117] "RemoveContainer" containerID="9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8" Mar 20 13:46:50 crc kubenswrapper[4849]: E0320 13:46:50.930620 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8\": container with ID starting with 9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8 not found: 
ID does not exist" containerID="9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8" Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.930656 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8"} err="failed to get container status \"9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8\": rpc error: code = NotFound desc = could not find container \"9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8\": container with ID starting with 9fd39a63574c6bdc941914a6eb0829a7d8cf4d78746f1e730324ff49b1e39fb8 not found: ID does not exist" Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.932410 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-config-data\") pod \"cfb1017b-3277-4667-aade-d2852d0ddd0e\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.932460 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntfj4\" (UniqueName: \"kubernetes.io/projected/cfb1017b-3277-4667-aade-d2852d0ddd0e-kube-api-access-ntfj4\") pod \"cfb1017b-3277-4667-aade-d2852d0ddd0e\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.932622 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-combined-ca-bundle\") pod \"cfb1017b-3277-4667-aade-d2852d0ddd0e\" (UID: \"cfb1017b-3277-4667-aade-d2852d0ddd0e\") " Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.938048 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb1017b-3277-4667-aade-d2852d0ddd0e-kube-api-access-ntfj4" 
(OuterVolumeSpecName: "kube-api-access-ntfj4") pod "cfb1017b-3277-4667-aade-d2852d0ddd0e" (UID: "cfb1017b-3277-4667-aade-d2852d0ddd0e"). InnerVolumeSpecName "kube-api-access-ntfj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.958784 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-config-data" (OuterVolumeSpecName: "config-data") pod "cfb1017b-3277-4667-aade-d2852d0ddd0e" (UID: "cfb1017b-3277-4667-aade-d2852d0ddd0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:50 crc kubenswrapper[4849]: I0320 13:46:50.959309 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfb1017b-3277-4667-aade-d2852d0ddd0e" (UID: "cfb1017b-3277-4667-aade-d2852d0ddd0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.036878 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.036911 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntfj4\" (UniqueName: \"kubernetes.io/projected/cfb1017b-3277-4667-aade-d2852d0ddd0e-kube-api-access-ntfj4\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.036926 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb1017b-3277-4667-aade-d2852d0ddd0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.049295 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.049340 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.238247 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.248751 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.256861 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:51 crc kubenswrapper[4849]: E0320 13:46:51.257322 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb1017b-3277-4667-aade-d2852d0ddd0e" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.257342 4849 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cfb1017b-3277-4667-aade-d2852d0ddd0e" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.257542 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb1017b-3277-4667-aade-d2852d0ddd0e" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.258204 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.260755 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.261560 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.261705 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.265378 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.342139 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.342202 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.342280 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.342300 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fxv7\" (UniqueName: \"kubernetes.io/projected/957dbf28-7479-44c7-96ed-787f99da4249-kube-api-access-5fxv7\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.342332 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.444540 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.444605 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" 
Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.444631 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fxv7\" (UniqueName: \"kubernetes.io/projected/957dbf28-7479-44c7-96ed-787f99da4249-kube-api-access-5fxv7\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.444668 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.444758 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.448901 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.449041 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 
13:46:51.449464 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.451567 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/957dbf28-7479-44c7-96ed-787f99da4249-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.462006 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fxv7\" (UniqueName: \"kubernetes.io/projected/957dbf28-7479-44c7-96ed-787f99da4249-kube-api-access-5fxv7\") pod \"nova-cell1-novncproxy-0\" (UID: \"957dbf28-7479-44c7-96ed-787f99da4249\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:51 crc kubenswrapper[4849]: I0320 13:46:51.573068 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:52 crc kubenswrapper[4849]: I0320 13:46:52.063398 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:46:52 crc kubenswrapper[4849]: I0320 13:46:52.927788 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"957dbf28-7479-44c7-96ed-787f99da4249","Type":"ContainerStarted","Data":"116b53744fe5787d2a4eb73c3edae617b03828dd24948cb023bcd70bc8803829"} Mar 20 13:46:52 crc kubenswrapper[4849]: I0320 13:46:52.928113 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"957dbf28-7479-44c7-96ed-787f99da4249","Type":"ContainerStarted","Data":"22de0947806a3e34977e1d317edbf3d48d3e7745a903837a83d390bfe4b5d563"} Mar 20 13:46:53 crc kubenswrapper[4849]: I0320 13:46:53.045414 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb1017b-3277-4667-aade-d2852d0ddd0e" path="/var/lib/kubelet/pods/cfb1017b-3277-4667-aade-d2852d0ddd0e/volumes" Mar 20 13:46:53 crc kubenswrapper[4849]: I0320 13:46:53.052617 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:46:53 crc kubenswrapper[4849]: I0320 13:46:53.058759 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:46:53 crc kubenswrapper[4849]: I0320 13:46:53.059793 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:46:53 crc kubenswrapper[4849]: I0320 13:46:53.080805 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.080787998 podStartE2EDuration="2.080787998s" podCreationTimestamp="2026-03-20 13:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 13:46:52.947442434 +0000 UTC m=+1362.625165869" watchObservedRunningTime="2026-03-20 13:46:53.080787998 +0000 UTC m=+1362.758511393" Mar 20 13:46:53 crc kubenswrapper[4849]: I0320 13:46:53.941396 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.119011 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ns6mb"] Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.120577 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.135071 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ns6mb"] Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.204705 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.204760 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-662wl\" (UniqueName: \"kubernetes.io/projected/80896962-f9f0-4207-a772-be4cf354e8e6-kube-api-access-662wl\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.204807 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: 
\"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.204868 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-config\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.204910 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.204958 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.306973 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.307061 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-config\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " 
pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.307115 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.307183 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.307223 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.307258 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-662wl\" (UniqueName: \"kubernetes.io/projected/80896962-f9f0-4207-a772-be4cf354e8e6-kube-api-access-662wl\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.307874 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 
crc kubenswrapper[4849]: I0320 13:46:54.308269 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-config\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.308303 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.308354 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.308740 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/80896962-f9f0-4207-a772-be4cf354e8e6-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.330642 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-662wl\" (UniqueName: \"kubernetes.io/projected/80896962-f9f0-4207-a772-be4cf354e8e6-kube-api-access-662wl\") pod \"dnsmasq-dns-89c5cd4d5-ns6mb\" (UID: \"80896962-f9f0-4207-a772-be4cf354e8e6\") " pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.460926 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:54 crc kubenswrapper[4849]: I0320 13:46:54.976746 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-ns6mb"] Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.822273 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.822814 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="ceilometer-central-agent" containerID="cri-o://f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56" gracePeriod=30 Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.822880 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="proxy-httpd" containerID="cri-o://d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178" gracePeriod=30 Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.822971 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="ceilometer-notification-agent" containerID="cri-o://3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b" gracePeriod=30 Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.822960 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="sg-core" containerID="cri-o://03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42" gracePeriod=30 Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.954455 4849 generic.go:334] "Generic (PLEG): container finished" podID="80896962-f9f0-4207-a772-be4cf354e8e6" 
containerID="4f412354af8ec7574e5359eb7b83ef321010bab7d123d36d0c9537f9980e2408" exitCode=0 Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.954915 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" event={"ID":"80896962-f9f0-4207-a772-be4cf354e8e6","Type":"ContainerDied","Data":"4f412354af8ec7574e5359eb7b83ef321010bab7d123d36d0c9537f9980e2408"} Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.954987 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" event={"ID":"80896962-f9f0-4207-a772-be4cf354e8e6","Type":"ContainerStarted","Data":"45b67db77fc206cc8c6449e088837b100ed3a93e4bf780c53a998dcf7917d503"} Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.958151 4849 generic.go:334] "Generic (PLEG): container finished" podID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerID="03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42" exitCode=2 Mar 20 13:46:55 crc kubenswrapper[4849]: I0320 13:46:55.958209 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerDied","Data":"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42"} Mar 20 13:46:56 crc kubenswrapper[4849]: I0320 13:46:56.163150 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.576261 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.826169 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.960069 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-combined-ca-bundle\") pod \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.960137 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-config-data\") pod \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.960188 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-scripts\") pod \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.960233 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-sg-core-conf-yaml\") pod \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.960294 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-ceilometer-tls-certs\") pod \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.960359 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szcg7\" (UniqueName: 
\"kubernetes.io/projected/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-kube-api-access-szcg7\") pod \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.960396 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-run-httpd\") pod \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.960423 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-log-httpd\") pod \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\" (UID: \"ee472bc4-f40a-41d9-b1e7-651f9c39eb27\") " Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.961367 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee472bc4-f40a-41d9-b1e7-651f9c39eb27" (UID: "ee472bc4-f40a-41d9-b1e7-651f9c39eb27"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.962543 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee472bc4-f40a-41d9-b1e7-651f9c39eb27" (UID: "ee472bc4-f40a-41d9-b1e7-651f9c39eb27"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.966026 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-kube-api-access-szcg7" (OuterVolumeSpecName: "kube-api-access-szcg7") pod "ee472bc4-f40a-41d9-b1e7-651f9c39eb27" (UID: "ee472bc4-f40a-41d9-b1e7-651f9c39eb27"). InnerVolumeSpecName "kube-api-access-szcg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.966695 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-scripts" (OuterVolumeSpecName: "scripts") pod "ee472bc4-f40a-41d9-b1e7-651f9c39eb27" (UID: "ee472bc4-f40a-41d9-b1e7-651f9c39eb27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.974940 4849 generic.go:334] "Generic (PLEG): container finished" podID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerID="d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178" exitCode=0 Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.974970 4849 generic.go:334] "Generic (PLEG): container finished" podID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerID="3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b" exitCode=0 Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.974979 4849 generic.go:334] "Generic (PLEG): container finished" podID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerID="f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56" exitCode=0 Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.974992 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.975024 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerDied","Data":"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178"} Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.975078 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerDied","Data":"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b"} Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.975095 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerDied","Data":"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56"} Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.975108 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee472bc4-f40a-41d9-b1e7-651f9c39eb27","Type":"ContainerDied","Data":"95c8147305f4025fbc302cbe021c637991874ac19ec8a7038df1b18515dceff9"} Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.975129 4849 scope.go:117] "RemoveContainer" containerID="d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.979211 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" event={"ID":"80896962-f9f0-4207-a772-be4cf354e8e6","Type":"ContainerStarted","Data":"8c0d24058b73e47a65222949d8dee4781ad25d803f02f130d031de21c448113c"} Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.979279 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-log" 
containerID="cri-o://a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b" gracePeriod=30 Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:56.979414 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-api" containerID="cri-o://c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787" gracePeriod=30 Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.005720 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" podStartSLOduration=3.005696878 podStartE2EDuration="3.005696878s" podCreationTimestamp="2026-03-20 13:46:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:56.999774124 +0000 UTC m=+1366.677497539" watchObservedRunningTime="2026-03-20 13:46:57.005696878 +0000 UTC m=+1366.683420273" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.030216 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ee472bc4-f40a-41d9-b1e7-651f9c39eb27" (UID: "ee472bc4-f40a-41d9-b1e7-651f9c39eb27"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.048987 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee472bc4-f40a-41d9-b1e7-651f9c39eb27" (UID: "ee472bc4-f40a-41d9-b1e7-651f9c39eb27"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.060931 4849 scope.go:117] "RemoveContainer" containerID="03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.062270 4849 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.062298 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szcg7\" (UniqueName: \"kubernetes.io/projected/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-kube-api-access-szcg7\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.062310 4849 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.062318 4849 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.062326 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.062334 4849 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.074093 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee472bc4-f40a-41d9-b1e7-651f9c39eb27" (UID: "ee472bc4-f40a-41d9-b1e7-651f9c39eb27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.096599 4849 scope.go:117] "RemoveContainer" containerID="3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.100788 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-config-data" (OuterVolumeSpecName: "config-data") pod "ee472bc4-f40a-41d9-b1e7-651f9c39eb27" (UID: "ee472bc4-f40a-41d9-b1e7-651f9c39eb27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.131689 4849 scope.go:117] "RemoveContainer" containerID="f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.154295 4849 scope.go:117] "RemoveContainer" containerID="d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178" Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.154774 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178\": container with ID starting with d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178 not found: ID does not exist" containerID="d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.154812 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178"} err="failed to get container 
status \"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178\": rpc error: code = NotFound desc = could not find container \"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178\": container with ID starting with d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178 not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.154854 4849 scope.go:117] "RemoveContainer" containerID="03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42" Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.155471 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42\": container with ID starting with 03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42 not found: ID does not exist" containerID="03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.155512 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42"} err="failed to get container status \"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42\": rpc error: code = NotFound desc = could not find container \"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42\": container with ID starting with 03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42 not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.155539 4849 scope.go:117] "RemoveContainer" containerID="3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b" Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.155951 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b\": container with ID starting with 3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b not found: ID does not exist" containerID="3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.155984 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b"} err="failed to get container status \"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b\": rpc error: code = NotFound desc = could not find container \"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b\": container with ID starting with 3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.156001 4849 scope.go:117] "RemoveContainer" containerID="f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56" Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.158345 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56\": container with ID starting with f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56 not found: ID does not exist" containerID="f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.158377 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56"} err="failed to get container status \"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56\": rpc error: code = NotFound desc = could not find container \"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56\": container with ID 
starting with f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56 not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.158394 4849 scope.go:117] "RemoveContainer" containerID="d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.158624 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178"} err="failed to get container status \"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178\": rpc error: code = NotFound desc = could not find container \"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178\": container with ID starting with d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178 not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.158651 4849 scope.go:117] "RemoveContainer" containerID="03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.158889 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42"} err="failed to get container status \"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42\": rpc error: code = NotFound desc = could not find container \"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42\": container with ID starting with 03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42 not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.158917 4849 scope.go:117] "RemoveContainer" containerID="3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.160106 4849 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b"} err="failed to get container status \"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b\": rpc error: code = NotFound desc = could not find container \"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b\": container with ID starting with 3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.160132 4849 scope.go:117] "RemoveContainer" containerID="f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.160541 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56"} err="failed to get container status \"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56\": rpc error: code = NotFound desc = could not find container \"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56\": container with ID starting with f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56 not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.160570 4849 scope.go:117] "RemoveContainer" containerID="d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.160800 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178"} err="failed to get container status \"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178\": rpc error: code = NotFound desc = could not find container \"d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178\": container with ID starting with d1ee732e112172e7126fd2e26b636ae626232269171bb21b36941267a99ce178 not found: ID does not 
exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.160841 4849 scope.go:117] "RemoveContainer" containerID="03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.161128 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42"} err="failed to get container status \"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42\": rpc error: code = NotFound desc = could not find container \"03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42\": container with ID starting with 03fb50598fad43a479115d94e76c44fe630988a541f04292f56e250a40fb2e42 not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.161150 4849 scope.go:117] "RemoveContainer" containerID="3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.161417 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b"} err="failed to get container status \"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b\": rpc error: code = NotFound desc = could not find container \"3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b\": container with ID starting with 3fdba09839a7e489e353aac3d61546d23673fee273b4a6e06897bc5d87d58e6b not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.161443 4849 scope.go:117] "RemoveContainer" containerID="f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.161706 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56"} err="failed to get container status 
\"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56\": rpc error: code = NotFound desc = could not find container \"f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56\": container with ID starting with f5b502fe592bc4a56ee52fc7be168cc68720ed89c34d999e0b6407669ab95a56 not found: ID does not exist" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.164216 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.164236 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee472bc4-f40a-41d9-b1e7-651f9c39eb27-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.309670 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.319472 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335169 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.335582 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="proxy-httpd" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335606 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="proxy-httpd" Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.335633 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="ceilometer-central-agent" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335640 4849 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="ceilometer-central-agent" Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.335653 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="ceilometer-notification-agent" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335661 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="ceilometer-notification-agent" Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.335676 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="sg-core" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335684 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="sg-core" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335900 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="ceilometer-notification-agent" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335913 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="sg-core" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335926 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="ceilometer-central-agent" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.335945 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" containerName="proxy-httpd" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.337601 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.339712 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.340385 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.341052 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.352451 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.474852 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-scripts\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.474912 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.474980 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-log-httpd\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.475066 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.475128 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-run-httpd\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.475175 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-config-data\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.475269 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzss\" (UniqueName: \"kubernetes.io/projected/e7e15095-34a2-4ab7-a578-e77290116b58-kube-api-access-hqzss\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.475294 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.577436 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578030 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-run-httpd\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578085 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-config-data\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578160 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzss\" (UniqueName: \"kubernetes.io/projected/e7e15095-34a2-4ab7-a578-e77290116b58-kube-api-access-hqzss\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578188 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-run-httpd\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578188 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578289 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-scripts\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578318 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578375 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-log-httpd\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.578786 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-log-httpd\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.585881 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.587063 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-scripts\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.594909 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-config-data\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.598636 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.604196 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.604553 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzss\" (UniqueName: \"kubernetes.io/projected/e7e15095-34a2-4ab7-a578-e77290116b58-kube-api-access-hqzss\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: E0320 13:46:57.605556 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs kube-api-access-hqzss], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="e7e15095-34a2-4ab7-a578-e77290116b58" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.606777 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " pod="openstack/ceilometer-0" Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.999641 4849 generic.go:334] "Generic (PLEG): container finished" podID="8f300511-17b3-4361-8694-3ba7b2d125ed" 
containerID="a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b" exitCode=143 Mar 20 13:46:57 crc kubenswrapper[4849]: I0320 13:46:57.999730 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8f300511-17b3-4361-8694-3ba7b2d125ed","Type":"ContainerDied","Data":"a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b"} Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.002785 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.003013 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.015258 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.092020 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-config-data\") pod \"e7e15095-34a2-4ab7-a578-e77290116b58\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.092119 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-sg-core-conf-yaml\") pod \"e7e15095-34a2-4ab7-a578-e77290116b58\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.092159 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqzss\" (UniqueName: \"kubernetes.io/projected/e7e15095-34a2-4ab7-a578-e77290116b58-kube-api-access-hqzss\") pod \"e7e15095-34a2-4ab7-a578-e77290116b58\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " Mar 20 13:46:58 crc 
kubenswrapper[4849]: I0320 13:46:58.092328 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-scripts\") pod \"e7e15095-34a2-4ab7-a578-e77290116b58\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.092377 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-run-httpd\") pod \"e7e15095-34a2-4ab7-a578-e77290116b58\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.092407 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-ceilometer-tls-certs\") pod \"e7e15095-34a2-4ab7-a578-e77290116b58\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.092429 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-combined-ca-bundle\") pod \"e7e15095-34a2-4ab7-a578-e77290116b58\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.092841 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-log-httpd\") pod \"e7e15095-34a2-4ab7-a578-e77290116b58\" (UID: \"e7e15095-34a2-4ab7-a578-e77290116b58\") " Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.092680 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"e7e15095-34a2-4ab7-a578-e77290116b58" (UID: "e7e15095-34a2-4ab7-a578-e77290116b58"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.093154 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e7e15095-34a2-4ab7-a578-e77290116b58" (UID: "e7e15095-34a2-4ab7-a578-e77290116b58"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.093685 4849 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.093710 4849 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e7e15095-34a2-4ab7-a578-e77290116b58-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.097910 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e7e15095-34a2-4ab7-a578-e77290116b58" (UID: "e7e15095-34a2-4ab7-a578-e77290116b58"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.097961 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-config-data" (OuterVolumeSpecName: "config-data") pod "e7e15095-34a2-4ab7-a578-e77290116b58" (UID: "e7e15095-34a2-4ab7-a578-e77290116b58"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.097985 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e15095-34a2-4ab7-a578-e77290116b58-kube-api-access-hqzss" (OuterVolumeSpecName: "kube-api-access-hqzss") pod "e7e15095-34a2-4ab7-a578-e77290116b58" (UID: "e7e15095-34a2-4ab7-a578-e77290116b58"). InnerVolumeSpecName "kube-api-access-hqzss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.098606 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-scripts" (OuterVolumeSpecName: "scripts") pod "e7e15095-34a2-4ab7-a578-e77290116b58" (UID: "e7e15095-34a2-4ab7-a578-e77290116b58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.099847 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e7e15095-34a2-4ab7-a578-e77290116b58" (UID: "e7e15095-34a2-4ab7-a578-e77290116b58"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.102957 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7e15095-34a2-4ab7-a578-e77290116b58" (UID: "e7e15095-34a2-4ab7-a578-e77290116b58"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.195145 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.195180 4849 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.195191 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.195200 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.195210 4849 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e7e15095-34a2-4ab7-a578-e77290116b58-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4849]: I0320 13:46:58.195219 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqzss\" (UniqueName: \"kubernetes.io/projected/e7e15095-34a2-4ab7-a578-e77290116b58-kube-api-access-hqzss\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.011388 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.048751 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee472bc4-f40a-41d9-b1e7-651f9c39eb27" path="/var/lib/kubelet/pods/ee472bc4-f40a-41d9-b1e7-651f9c39eb27/volumes" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.129886 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.149392 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.164455 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.173461 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.177286 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.177558 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.177695 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.206367 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.328853 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 
13:46:59.328892 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.329059 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-scripts\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.329273 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6175e92-e13b-450c-82ca-c27cd658ba57-run-httpd\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.329347 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2nrk\" (UniqueName: \"kubernetes.io/projected/e6175e92-e13b-450c-82ca-c27cd658ba57-kube-api-access-q2nrk\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.329428 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6175e92-e13b-450c-82ca-c27cd658ba57-log-httpd\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.329521 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-config-data\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.329623 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.432084 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2nrk\" (UniqueName: \"kubernetes.io/projected/e6175e92-e13b-450c-82ca-c27cd658ba57-kube-api-access-q2nrk\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.432152 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6175e92-e13b-450c-82ca-c27cd658ba57-log-httpd\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.432203 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-config-data\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.432250 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " 
pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.432306 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.432920 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.432979 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-scripts\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.432855 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6175e92-e13b-450c-82ca-c27cd658ba57-log-httpd\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.433087 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e6175e92-e13b-450c-82ca-c27cd658ba57-run-httpd\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.433427 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e6175e92-e13b-450c-82ca-c27cd658ba57-run-httpd\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.436699 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.438055 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-config-data\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.438192 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-scripts\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.438869 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.440543 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e6175e92-e13b-450c-82ca-c27cd658ba57-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.465321 4849 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2nrk\" (UniqueName: \"kubernetes.io/projected/e6175e92-e13b-450c-82ca-c27cd658ba57-kube-api-access-q2nrk\") pod \"ceilometer-0\" (UID: \"e6175e92-e13b-450c-82ca-c27cd658ba57\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.511070 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4849]: I0320 13:46:59.771607 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:59 crc kubenswrapper[4849]: W0320 13:46:59.777603 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6175e92_e13b_450c_82ca_c27cd658ba57.slice/crio-50d3c21a0820cadd977fd3779eea492355e78797c298e800dfed0f67156d3142 WatchSource:0}: Error finding container 50d3c21a0820cadd977fd3779eea492355e78797c298e800dfed0f67156d3142: Status 404 returned error can't find the container with id 50d3c21a0820cadd977fd3779eea492355e78797c298e800dfed0f67156d3142 Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.022001 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6175e92-e13b-450c-82ca-c27cd658ba57","Type":"ContainerStarted","Data":"50d3c21a0820cadd977fd3779eea492355e78797c298e800dfed0f67156d3142"} Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.616448 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.756220 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-combined-ca-bundle\") pod \"8f300511-17b3-4361-8694-3ba7b2d125ed\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.756623 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kttt4\" (UniqueName: \"kubernetes.io/projected/8f300511-17b3-4361-8694-3ba7b2d125ed-kube-api-access-kttt4\") pod \"8f300511-17b3-4361-8694-3ba7b2d125ed\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.756713 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f300511-17b3-4361-8694-3ba7b2d125ed-logs\") pod \"8f300511-17b3-4361-8694-3ba7b2d125ed\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.756868 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-config-data\") pod \"8f300511-17b3-4361-8694-3ba7b2d125ed\" (UID: \"8f300511-17b3-4361-8694-3ba7b2d125ed\") " Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.757210 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f300511-17b3-4361-8694-3ba7b2d125ed-logs" (OuterVolumeSpecName: "logs") pod "8f300511-17b3-4361-8694-3ba7b2d125ed" (UID: "8f300511-17b3-4361-8694-3ba7b2d125ed"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.757406 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f300511-17b3-4361-8694-3ba7b2d125ed-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.763947 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f300511-17b3-4361-8694-3ba7b2d125ed-kube-api-access-kttt4" (OuterVolumeSpecName: "kube-api-access-kttt4") pod "8f300511-17b3-4361-8694-3ba7b2d125ed" (UID: "8f300511-17b3-4361-8694-3ba7b2d125ed"). InnerVolumeSpecName "kube-api-access-kttt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.789854 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-config-data" (OuterVolumeSpecName: "config-data") pod "8f300511-17b3-4361-8694-3ba7b2d125ed" (UID: "8f300511-17b3-4361-8694-3ba7b2d125ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.799709 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f300511-17b3-4361-8694-3ba7b2d125ed" (UID: "8f300511-17b3-4361-8694-3ba7b2d125ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.858966 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.859005 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f300511-17b3-4361-8694-3ba7b2d125ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:00 crc kubenswrapper[4849]: I0320 13:47:00.859020 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kttt4\" (UniqueName: \"kubernetes.io/projected/8f300511-17b3-4361-8694-3ba7b2d125ed-kube-api-access-kttt4\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.033536 4849 generic.go:334] "Generic (PLEG): container finished" podID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerID="c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787" exitCode=0 Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.033596 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.033981 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8f300511-17b3-4361-8694-3ba7b2d125ed","Type":"ContainerDied","Data":"c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787"} Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.034054 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8f300511-17b3-4361-8694-3ba7b2d125ed","Type":"ContainerDied","Data":"f5fb8de51205d7246358d1298c6fad2c929a7f7676cf69443fa47f2a5ee0f3f9"} Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.034072 4849 scope.go:117] "RemoveContainer" containerID="c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.066805 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e15095-34a2-4ab7-a578-e77290116b58" path="/var/lib/kubelet/pods/e7e15095-34a2-4ab7-a578-e77290116b58/volumes" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.067539 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6175e92-e13b-450c-82ca-c27cd658ba57","Type":"ContainerStarted","Data":"7fbf4d9a348ca9eec26ead8d1d54fe0b775a80ffb20f43df23fadddb9165c1d4"} Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.071245 4849 scope.go:117] "RemoveContainer" containerID="a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.108393 4849 scope.go:117] "RemoveContainer" containerID="c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787" Mar 20 13:47:01 crc kubenswrapper[4849]: E0320 13:47:01.108883 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787\": container with 
ID starting with c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787 not found: ID does not exist" containerID="c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.108924 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787"} err="failed to get container status \"c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787\": rpc error: code = NotFound desc = could not find container \"c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787\": container with ID starting with c2a1206f74ff06cb272290e31e8d1d4433d77af4ce6d3d3015cfa8c8a81c4787 not found: ID does not exist" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.108944 4849 scope.go:117] "RemoveContainer" containerID="a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b" Mar 20 13:47:01 crc kubenswrapper[4849]: E0320 13:47:01.109185 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b\": container with ID starting with a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b not found: ID does not exist" containerID="a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.109206 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b"} err="failed to get container status \"a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b\": rpc error: code = NotFound desc = could not find container \"a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b\": container with ID starting with a93b1df0c508751765b124f9a364c74b1b9b7908c8eebffea0964ecd3a11d48b not 
found: ID does not exist" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.129889 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.138889 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.153298 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:01 crc kubenswrapper[4849]: E0320 13:47:01.153757 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-log" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.153781 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-log" Mar 20 13:47:01 crc kubenswrapper[4849]: E0320 13:47:01.153793 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-api" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.153800 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-api" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.154009 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-log" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.154037 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" containerName="nova-api-api" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.154981 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.157444 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.157717 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.157901 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.170391 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.270272 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eab88ade-f99b-4ca2-861f-611b50ff38c0-logs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.270337 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.270360 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.270450 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.270468 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-config-data\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.270482 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cflq\" (UniqueName: \"kubernetes.io/projected/eab88ade-f99b-4ca2-861f-611b50ff38c0-kube-api-access-9cflq\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.372426 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.372739 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-config-data\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.372756 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cflq\" (UniqueName: \"kubernetes.io/projected/eab88ade-f99b-4ca2-861f-611b50ff38c0-kube-api-access-9cflq\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 
20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.372865 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eab88ade-f99b-4ca2-861f-611b50ff38c0-logs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.372939 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.372966 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.374090 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eab88ade-f99b-4ca2-861f-611b50ff38c0-logs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.376562 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.377033 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-config-data\") pod \"nova-api-0\" (UID: 
\"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.378025 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.378239 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.388754 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cflq\" (UniqueName: \"kubernetes.io/projected/eab88ade-f99b-4ca2-861f-611b50ff38c0-kube-api-access-9cflq\") pod \"nova-api-0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.484084 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.574167 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:47:01 crc kubenswrapper[4849]: I0320 13:47:01.609284 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.013347 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:02 crc kubenswrapper[4849]: W0320 13:47:02.014539 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeab88ade_f99b_4ca2_861f_611b50ff38c0.slice/crio-3157869161b9c53bfcc7dd24170be14d14064f9e6a9f348e8936e7a4f4f2f579 WatchSource:0}: Error finding container 3157869161b9c53bfcc7dd24170be14d14064f9e6a9f348e8936e7a4f4f2f579: Status 404 returned error can't find the container with id 3157869161b9c53bfcc7dd24170be14d14064f9e6a9f348e8936e7a4f4f2f579 Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.046278 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6175e92-e13b-450c-82ca-c27cd658ba57","Type":"ContainerStarted","Data":"424781db82a2d0af9028a6f22e9c7b7d9eecf99ab80369400bb8f0ca8cda8c92"} Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.046327 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6175e92-e13b-450c-82ca-c27cd658ba57","Type":"ContainerStarted","Data":"a71ed7c92af22f42f0533f134565722a02ab147bd1fe619ea6192d92ab9da057"} Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.047473 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eab88ade-f99b-4ca2-861f-611b50ff38c0","Type":"ContainerStarted","Data":"3157869161b9c53bfcc7dd24170be14d14064f9e6a9f348e8936e7a4f4f2f579"} Mar 20 
13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.062393 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.280836 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-qj5cd"] Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.282072 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.284616 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.286351 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.299214 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qj5cd"] Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.395320 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-config-data\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.395678 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-scripts\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.395737 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.395899 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47p4p\" (UniqueName: \"kubernetes.io/projected/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-kube-api-access-47p4p\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.497254 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-scripts\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.497305 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.497394 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47p4p\" (UniqueName: \"kubernetes.io/projected/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-kube-api-access-47p4p\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.497476 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-config-data\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.504371 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-config-data\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.504617 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.505283 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-scripts\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.515529 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47p4p\" (UniqueName: \"kubernetes.io/projected/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-kube-api-access-47p4p\") pod \"nova-cell1-cell-mapping-qj5cd\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:02 crc kubenswrapper[4849]: I0320 13:47:02.618501 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:03 crc kubenswrapper[4849]: I0320 13:47:03.049319 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f300511-17b3-4361-8694-3ba7b2d125ed" path="/var/lib/kubelet/pods/8f300511-17b3-4361-8694-3ba7b2d125ed/volumes" Mar 20 13:47:03 crc kubenswrapper[4849]: I0320 13:47:03.067446 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eab88ade-f99b-4ca2-861f-611b50ff38c0","Type":"ContainerStarted","Data":"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c"} Mar 20 13:47:03 crc kubenswrapper[4849]: I0320 13:47:03.067492 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eab88ade-f99b-4ca2-861f-611b50ff38c0","Type":"ContainerStarted","Data":"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36"} Mar 20 13:47:03 crc kubenswrapper[4849]: I0320 13:47:03.095338 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qj5cd"] Mar 20 13:47:03 crc kubenswrapper[4849]: I0320 13:47:03.098912 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.09889189 podStartE2EDuration="2.09889189s" podCreationTimestamp="2026-03-20 13:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:03.093079789 +0000 UTC m=+1372.770803204" watchObservedRunningTime="2026-03-20 13:47:03.09889189 +0000 UTC m=+1372.776615295" Mar 20 13:47:04 crc kubenswrapper[4849]: I0320 13:47:04.080296 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qj5cd" event={"ID":"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a","Type":"ContainerStarted","Data":"a16c7a41623c1fb9c6cf2fe796a15c5d8b11a66d2b6f2f290c8032c28871594a"} Mar 20 13:47:04 crc kubenswrapper[4849]: I0320 
13:47:04.080627 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qj5cd" event={"ID":"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a","Type":"ContainerStarted","Data":"7dab3265ca9faf74e0c1242eb1b42ba1f6c949da5aa634a5eb0220b7a0cc6dd4"} Mar 20 13:47:04 crc kubenswrapper[4849]: I0320 13:47:04.110026 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-qj5cd" podStartSLOduration=2.110006214 podStartE2EDuration="2.110006214s" podCreationTimestamp="2026-03-20 13:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:04.098110554 +0000 UTC m=+1373.775833979" watchObservedRunningTime="2026-03-20 13:47:04.110006214 +0000 UTC m=+1373.787729619" Mar 20 13:47:04 crc kubenswrapper[4849]: I0320 13:47:04.463012 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-ns6mb" Mar 20 13:47:04 crc kubenswrapper[4849]: I0320 13:47:04.522430 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-t4j65"] Mar 20 13:47:04 crc kubenswrapper[4849]: I0320 13:47:04.522729 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" podUID="8e76c274-ce20-458a-a78e-84f736089dd1" containerName="dnsmasq-dns" containerID="cri-o://fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5" gracePeriod=10 Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.044091 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.089221 4849 generic.go:334] "Generic (PLEG): container finished" podID="8e76c274-ce20-458a-a78e-84f736089dd1" containerID="fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5" exitCode=0 Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.089292 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" event={"ID":"8e76c274-ce20-458a-a78e-84f736089dd1","Type":"ContainerDied","Data":"fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5"} Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.089323 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" event={"ID":"8e76c274-ce20-458a-a78e-84f736089dd1","Type":"ContainerDied","Data":"81b8edb16c14aafab2290be1123ecf4284bd146f5d05164db422fa0bf53f3707"} Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.089343 4849 scope.go:117] "RemoveContainer" containerID="fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.089457 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-t4j65" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.093439 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e6175e92-e13b-450c-82ca-c27cd658ba57","Type":"ContainerStarted","Data":"82ac3fccc7f686a892f37e1678dd925390c8488b7db2b5c3eb7cbff56402dde2"} Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.093906 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.126885 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.067874214 podStartE2EDuration="6.126837746s" podCreationTimestamp="2026-03-20 13:46:59 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.780502993 +0000 UTC m=+1369.458226388" lastFinishedPulling="2026-03-20 13:47:03.839466515 +0000 UTC m=+1373.517189920" observedRunningTime="2026-03-20 13:47:05.115321596 +0000 UTC m=+1374.793045011" watchObservedRunningTime="2026-03-20 13:47:05.126837746 +0000 UTC m=+1374.804561161" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.139364 4849 scope.go:117] "RemoveContainer" containerID="ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.169925 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-sb\") pod \"8e76c274-ce20-458a-a78e-84f736089dd1\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.170055 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-swift-storage-0\") pod \"8e76c274-ce20-458a-a78e-84f736089dd1\" (UID: 
\"8e76c274-ce20-458a-a78e-84f736089dd1\") " Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.170131 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-svc\") pod \"8e76c274-ce20-458a-a78e-84f736089dd1\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.170163 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-nb\") pod \"8e76c274-ce20-458a-a78e-84f736089dd1\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.170215 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-config\") pod \"8e76c274-ce20-458a-a78e-84f736089dd1\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.170265 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr9rg\" (UniqueName: \"kubernetes.io/projected/8e76c274-ce20-458a-a78e-84f736089dd1-kube-api-access-sr9rg\") pod \"8e76c274-ce20-458a-a78e-84f736089dd1\" (UID: \"8e76c274-ce20-458a-a78e-84f736089dd1\") " Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.183031 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e76c274-ce20-458a-a78e-84f736089dd1-kube-api-access-sr9rg" (OuterVolumeSpecName: "kube-api-access-sr9rg") pod "8e76c274-ce20-458a-a78e-84f736089dd1" (UID: "8e76c274-ce20-458a-a78e-84f736089dd1"). InnerVolumeSpecName "kube-api-access-sr9rg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.187069 4849 scope.go:117] "RemoveContainer" containerID="fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5" Mar 20 13:47:05 crc kubenswrapper[4849]: E0320 13:47:05.187620 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5\": container with ID starting with fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5 not found: ID does not exist" containerID="fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.187649 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5"} err="failed to get container status \"fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5\": rpc error: code = NotFound desc = could not find container \"fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5\": container with ID starting with fcdb37f625cfd2bc7d1c4eea5ef468bea708b928224a7340b7129f7b9467a5f5 not found: ID does not exist" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.187669 4849 scope.go:117] "RemoveContainer" containerID="ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1" Mar 20 13:47:05 crc kubenswrapper[4849]: E0320 13:47:05.188159 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1\": container with ID starting with ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1 not found: ID does not exist" containerID="ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.188180 
4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1"} err="failed to get container status \"ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1\": rpc error: code = NotFound desc = could not find container \"ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1\": container with ID starting with ccd1df54dc8b3867bd3736ea2ff896270d5fc946cb6b35528ce0f9430a2d49f1 not found: ID does not exist" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.236666 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e76c274-ce20-458a-a78e-84f736089dd1" (UID: "8e76c274-ce20-458a-a78e-84f736089dd1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.238545 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-config" (OuterVolumeSpecName: "config") pod "8e76c274-ce20-458a-a78e-84f736089dd1" (UID: "8e76c274-ce20-458a-a78e-84f736089dd1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.248369 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e76c274-ce20-458a-a78e-84f736089dd1" (UID: "8e76c274-ce20-458a-a78e-84f736089dd1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.250729 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8e76c274-ce20-458a-a78e-84f736089dd1" (UID: "8e76c274-ce20-458a-a78e-84f736089dd1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.266345 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e76c274-ce20-458a-a78e-84f736089dd1" (UID: "8e76c274-ce20-458a-a78e-84f736089dd1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.273070 4849 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.273103 4849 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.273113 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.273122 4849 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-config\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.273131 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr9rg\" (UniqueName: \"kubernetes.io/projected/8e76c274-ce20-458a-a78e-84f736089dd1-kube-api-access-sr9rg\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.273143 4849 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e76c274-ce20-458a-a78e-84f736089dd1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.431005 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-t4j65"] Mar 20 13:47:05 crc kubenswrapper[4849]: I0320 13:47:05.439777 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-t4j65"] Mar 20 13:47:07 crc kubenswrapper[4849]: I0320 13:47:07.048011 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e76c274-ce20-458a-a78e-84f736089dd1" path="/var/lib/kubelet/pods/8e76c274-ce20-458a-a78e-84f736089dd1/volumes" Mar 20 13:47:08 crc kubenswrapper[4849]: I0320 13:47:08.126903 4849 generic.go:334] "Generic (PLEG): container finished" podID="e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" containerID="a16c7a41623c1fb9c6cf2fe796a15c5d8b11a66d2b6f2f290c8032c28871594a" exitCode=0 Mar 20 13:47:08 crc kubenswrapper[4849]: I0320 13:47:08.127011 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qj5cd" event={"ID":"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a","Type":"ContainerDied","Data":"a16c7a41623c1fb9c6cf2fe796a15c5d8b11a66d2b6f2f290c8032c28871594a"} Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.385387 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.386021 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.550366 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.654192 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-scripts\") pod \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.654280 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-config-data\") pod \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.654346 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-combined-ca-bundle\") pod \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.654409 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47p4p\" (UniqueName: \"kubernetes.io/projected/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-kube-api-access-47p4p\") pod 
\"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\" (UID: \"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a\") " Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.660547 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-scripts" (OuterVolumeSpecName: "scripts") pod "e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" (UID: "e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.663438 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-kube-api-access-47p4p" (OuterVolumeSpecName: "kube-api-access-47p4p") pod "e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" (UID: "e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a"). InnerVolumeSpecName "kube-api-access-47p4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.681909 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-config-data" (OuterVolumeSpecName: "config-data") pod "e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" (UID: "e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.683240 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" (UID: "e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.757221 4849 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.757252 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.757261 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:09 crc kubenswrapper[4849]: I0320 13:47:09.757273 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47p4p\" (UniqueName: \"kubernetes.io/projected/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a-kube-api-access-47p4p\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.146762 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qj5cd" event={"ID":"e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a","Type":"ContainerDied","Data":"7dab3265ca9faf74e0c1242eb1b42ba1f6c949da5aa634a5eb0220b7a0cc6dd4"} Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.146806 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dab3265ca9faf74e0c1242eb1b42ba1f6c949da5aa634a5eb0220b7a0cc6dd4" Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.146781 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qj5cd" Mar 20 13:47:10 crc kubenswrapper[4849]: E0320 13:47:10.299533 4849 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode40d1b51_4a54_41f3_bc73_2e9c4e6dff1a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode40d1b51_4a54_41f3_bc73_2e9c4e6dff1a.slice/crio-7dab3265ca9faf74e0c1242eb1b42ba1f6c949da5aa634a5eb0220b7a0cc6dd4\": RecentStats: unable to find data in memory cache]" Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.323864 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.324348 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="80d41c24-8a33-4183-9397-f46556219054" containerName="nova-scheduler-scheduler" containerID="cri-o://9f330ca93994e4ddab6a0b638127fa00003b61328412a4deeb92d34683743289" gracePeriod=30 Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.340039 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.340283 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerName="nova-api-log" containerID="cri-o://cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36" gracePeriod=30 Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.340356 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerName="nova-api-api" containerID="cri-o://ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c" gracePeriod=30 Mar 20 
13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.362586 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.362807 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-log" containerID="cri-o://05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb" gracePeriod=30 Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.362927 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-metadata" containerID="cri-o://b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d" gracePeriod=30 Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.897903 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.976080 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-config-data\") pod \"eab88ade-f99b-4ca2-861f-611b50ff38c0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.976227 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-internal-tls-certs\") pod \"eab88ade-f99b-4ca2-861f-611b50ff38c0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.976275 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-combined-ca-bundle\") pod 
\"eab88ade-f99b-4ca2-861f-611b50ff38c0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.976397 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-public-tls-certs\") pod \"eab88ade-f99b-4ca2-861f-611b50ff38c0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.976439 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eab88ade-f99b-4ca2-861f-611b50ff38c0-logs\") pod \"eab88ade-f99b-4ca2-861f-611b50ff38c0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.976482 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cflq\" (UniqueName: \"kubernetes.io/projected/eab88ade-f99b-4ca2-861f-611b50ff38c0-kube-api-access-9cflq\") pod \"eab88ade-f99b-4ca2-861f-611b50ff38c0\" (UID: \"eab88ade-f99b-4ca2-861f-611b50ff38c0\") " Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.977090 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eab88ade-f99b-4ca2-861f-611b50ff38c0-logs" (OuterVolumeSpecName: "logs") pod "eab88ade-f99b-4ca2-861f-611b50ff38c0" (UID: "eab88ade-f99b-4ca2-861f-611b50ff38c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:10 crc kubenswrapper[4849]: I0320 13:47:10.991869 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab88ade-f99b-4ca2-861f-611b50ff38c0-kube-api-access-9cflq" (OuterVolumeSpecName: "kube-api-access-9cflq") pod "eab88ade-f99b-4ca2-861f-611b50ff38c0" (UID: "eab88ade-f99b-4ca2-861f-611b50ff38c0"). InnerVolumeSpecName "kube-api-access-9cflq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.008806 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eab88ade-f99b-4ca2-861f-611b50ff38c0" (UID: "eab88ade-f99b-4ca2-861f-611b50ff38c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.028132 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eab88ade-f99b-4ca2-861f-611b50ff38c0" (UID: "eab88ade-f99b-4ca2-861f-611b50ff38c0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.034788 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eab88ade-f99b-4ca2-861f-611b50ff38c0" (UID: "eab88ade-f99b-4ca2-861f-611b50ff38c0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.039782 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-config-data" (OuterVolumeSpecName: "config-data") pod "eab88ade-f99b-4ca2-861f-611b50ff38c0" (UID: "eab88ade-f99b-4ca2-861f-611b50ff38c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.079134 4849 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.079175 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eab88ade-f99b-4ca2-861f-611b50ff38c0-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.079191 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cflq\" (UniqueName: \"kubernetes.io/projected/eab88ade-f99b-4ca2-861f-611b50ff38c0-kube-api-access-9cflq\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.079223 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.079236 4849 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.079247 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab88ade-f99b-4ca2-861f-611b50ff38c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.157124 4849 generic.go:334] "Generic (PLEG): container finished" podID="80d41c24-8a33-4183-9397-f46556219054" containerID="9f330ca93994e4ddab6a0b638127fa00003b61328412a4deeb92d34683743289" exitCode=0 Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.157160 4849 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d41c24-8a33-4183-9397-f46556219054","Type":"ContainerDied","Data":"9f330ca93994e4ddab6a0b638127fa00003b61328412a4deeb92d34683743289"} Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.160146 4849 generic.go:334] "Generic (PLEG): container finished" podID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerID="ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c" exitCode=0 Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.160165 4849 generic.go:334] "Generic (PLEG): container finished" podID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerID="cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36" exitCode=143 Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.160200 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eab88ade-f99b-4ca2-861f-611b50ff38c0","Type":"ContainerDied","Data":"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c"} Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.160216 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eab88ade-f99b-4ca2-861f-611b50ff38c0","Type":"ContainerDied","Data":"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36"} Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.160225 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eab88ade-f99b-4ca2-861f-611b50ff38c0","Type":"ContainerDied","Data":"3157869161b9c53bfcc7dd24170be14d14064f9e6a9f348e8936e7a4f4f2f579"} Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.160241 4849 scope.go:117] "RemoveContainer" containerID="ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.160273 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.166110 4849 generic.go:334] "Generic (PLEG): container finished" podID="07a19c91-f95f-456b-a8dd-52743845e141" containerID="05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb" exitCode=143 Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.166144 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07a19c91-f95f-456b-a8dd-52743845e141","Type":"ContainerDied","Data":"05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb"} Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.193185 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.204359 4849 scope.go:117] "RemoveContainer" containerID="cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.205354 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.215969 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:11 crc kubenswrapper[4849]: E0320 13:47:11.216388 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" containerName="nova-manage" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216401 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" containerName="nova-manage" Mar 20 13:47:11 crc kubenswrapper[4849]: E0320 13:47:11.216414 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e76c274-ce20-458a-a78e-84f736089dd1" containerName="dnsmasq-dns" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216419 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e76c274-ce20-458a-a78e-84f736089dd1" 
containerName="dnsmasq-dns" Mar 20 13:47:11 crc kubenswrapper[4849]: E0320 13:47:11.216447 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerName="nova-api-log" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216455 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerName="nova-api-log" Mar 20 13:47:11 crc kubenswrapper[4849]: E0320 13:47:11.216470 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e76c274-ce20-458a-a78e-84f736089dd1" containerName="init" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216475 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e76c274-ce20-458a-a78e-84f736089dd1" containerName="init" Mar 20 13:47:11 crc kubenswrapper[4849]: E0320 13:47:11.216488 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerName="nova-api-api" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216493 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerName="nova-api-api" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216654 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerName="nova-api-log" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216672 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" containerName="nova-manage" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216687 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e76c274-ce20-458a-a78e-84f736089dd1" containerName="dnsmasq-dns" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.216697 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" containerName="nova-api-api" Mar 20 13:47:11 
crc kubenswrapper[4849]: I0320 13:47:11.217717 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.220387 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.220568 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.220838 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.227441 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.229972 4849 scope.go:117] "RemoveContainer" containerID="ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c" Mar 20 13:47:11 crc kubenswrapper[4849]: E0320 13:47:11.235928 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c\": container with ID starting with ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c not found: ID does not exist" containerID="ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.235989 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c"} err="failed to get container status \"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c\": rpc error: code = NotFound desc = could not find container \"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c\": container with ID starting with ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c 
not found: ID does not exist" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.236023 4849 scope.go:117] "RemoveContainer" containerID="cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36" Mar 20 13:47:11 crc kubenswrapper[4849]: E0320 13:47:11.239448 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36\": container with ID starting with cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36 not found: ID does not exist" containerID="cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.239500 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36"} err="failed to get container status \"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36\": rpc error: code = NotFound desc = could not find container \"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36\": container with ID starting with cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36 not found: ID does not exist" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.239536 4849 scope.go:117] "RemoveContainer" containerID="ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.242999 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c"} err="failed to get container status \"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c\": rpc error: code = NotFound desc = could not find container \"ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c\": container with ID starting with 
ae7a37e1b61be8a511fca2c2bd21050cc904abb805d65ee36733f6d50eadb05c not found: ID does not exist" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.243127 4849 scope.go:117] "RemoveContainer" containerID="cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.243573 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36"} err="failed to get container status \"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36\": rpc error: code = NotFound desc = could not find container \"cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36\": container with ID starting with cbd09ce551e7a150f753e39e1b19954725057b181557144215d0fc0375d71b36 not found: ID does not exist" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.282328 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-internal-tls-certs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.282437 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6ggb\" (UniqueName: \"kubernetes.io/projected/30ed0b8d-41c8-4175-a998-6109bc3ec143-kube-api-access-m6ggb\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.282474 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-config-data\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 
13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.282569 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ed0b8d-41c8-4175-a998-6109bc3ec143-logs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.282958 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-public-tls-certs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.283033 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.384864 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-public-tls-certs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.384940 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.384965 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-internal-tls-certs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.385018 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6ggb\" (UniqueName: \"kubernetes.io/projected/30ed0b8d-41c8-4175-a998-6109bc3ec143-kube-api-access-m6ggb\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.385070 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-config-data\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.385128 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ed0b8d-41c8-4175-a998-6109bc3ec143-logs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.385544 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30ed0b8d-41c8-4175-a998-6109bc3ec143-logs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.389938 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.390397 4849 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-public-tls-certs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.390805 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-config-data\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.390917 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30ed0b8d-41c8-4175-a998-6109bc3ec143-internal-tls-certs\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.401098 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6ggb\" (UniqueName: \"kubernetes.io/projected/30ed0b8d-41c8-4175-a998-6109bc3ec143-kube-api-access-m6ggb\") pod \"nova-api-0\" (UID: \"30ed0b8d-41c8-4175-a998-6109bc3ec143\") " pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.451058 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.486840 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-config-data\") pod \"80d41c24-8a33-4183-9397-f46556219054\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.486946 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbz6j\" (UniqueName: \"kubernetes.io/projected/80d41c24-8a33-4183-9397-f46556219054-kube-api-access-xbz6j\") pod \"80d41c24-8a33-4183-9397-f46556219054\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.487065 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-combined-ca-bundle\") pod \"80d41c24-8a33-4183-9397-f46556219054\" (UID: \"80d41c24-8a33-4183-9397-f46556219054\") " Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.501361 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d41c24-8a33-4183-9397-f46556219054-kube-api-access-xbz6j" (OuterVolumeSpecName: "kube-api-access-xbz6j") pod "80d41c24-8a33-4183-9397-f46556219054" (UID: "80d41c24-8a33-4183-9397-f46556219054"). InnerVolumeSpecName "kube-api-access-xbz6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.526211 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-config-data" (OuterVolumeSpecName: "config-data") pod "80d41c24-8a33-4183-9397-f46556219054" (UID: "80d41c24-8a33-4183-9397-f46556219054"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.540999 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.545392 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80d41c24-8a33-4183-9397-f46556219054" (UID: "80d41c24-8a33-4183-9397-f46556219054"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.592839 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.592901 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbz6j\" (UniqueName: \"kubernetes.io/projected/80d41c24-8a33-4183-9397-f46556219054-kube-api-access-xbz6j\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.592915 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d41c24-8a33-4183-9397-f46556219054-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:11 crc kubenswrapper[4849]: I0320 13:47:11.993104 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.186584 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30ed0b8d-41c8-4175-a998-6109bc3ec143","Type":"ContainerStarted","Data":"8e23c84571ffe2ca713166f4a26bbd5a081d48ccbfa53402d5d6bd4e5a0811c0"} Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.198291 4849 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d41c24-8a33-4183-9397-f46556219054","Type":"ContainerDied","Data":"43f7524f23decfa3241edfa115417c39f277e21eaeae7d1836bc982dd68818fc"} Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.198353 4849 scope.go:117] "RemoveContainer" containerID="9f330ca93994e4ddab6a0b638127fa00003b61328412a4deeb92d34683743289" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.198514 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.302885 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.327886 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.342271 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:12 crc kubenswrapper[4849]: E0320 13:47:12.342685 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d41c24-8a33-4183-9397-f46556219054" containerName="nova-scheduler-scheduler" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.342704 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d41c24-8a33-4183-9397-f46556219054" containerName="nova-scheduler-scheduler" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.342940 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d41c24-8a33-4183-9397-f46556219054" containerName="nova-scheduler-scheduler" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.343533 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.346625 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.350651 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.410075 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a506df4-da99-4034-91dc-ede44412cebf-config-data\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.410526 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhhwc\" (UniqueName: \"kubernetes.io/projected/4a506df4-da99-4034-91dc-ede44412cebf-kube-api-access-jhhwc\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.410630 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a506df4-da99-4034-91dc-ede44412cebf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.512256 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a506df4-da99-4034-91dc-ede44412cebf-config-data\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.512372 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jhhwc\" (UniqueName: \"kubernetes.io/projected/4a506df4-da99-4034-91dc-ede44412cebf-kube-api-access-jhhwc\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.512450 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a506df4-da99-4034-91dc-ede44412cebf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.517130 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a506df4-da99-4034-91dc-ede44412cebf-config-data\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.519196 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a506df4-da99-4034-91dc-ede44412cebf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.532447 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhhwc\" (UniqueName: \"kubernetes.io/projected/4a506df4-da99-4034-91dc-ede44412cebf-kube-api-access-jhhwc\") pod \"nova-scheduler-0\" (UID: \"4a506df4-da99-4034-91dc-ede44412cebf\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:12 crc kubenswrapper[4849]: I0320 13:47:12.659253 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.046010 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d41c24-8a33-4183-9397-f46556219054" path="/var/lib/kubelet/pods/80d41c24-8a33-4183-9397-f46556219054/volumes" Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.047060 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab88ade-f99b-4ca2-861f-611b50ff38c0" path="/var/lib/kubelet/pods/eab88ade-f99b-4ca2-861f-611b50ff38c0/volumes" Mar 20 13:47:13 crc kubenswrapper[4849]: W0320 13:47:13.089704 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a506df4_da99_4034_91dc_ede44412cebf.slice/crio-e6b4b608afd131c1fc2087d3ec45bd6a09d09c6de5b77b3f4f5150830c801256 WatchSource:0}: Error finding container e6b4b608afd131c1fc2087d3ec45bd6a09d09c6de5b77b3f4f5150830c801256: Status 404 returned error can't find the container with id e6b4b608afd131c1fc2087d3ec45bd6a09d09c6de5b77b3f4f5150830c801256 Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.094514 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.218601 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a506df4-da99-4034-91dc-ede44412cebf","Type":"ContainerStarted","Data":"e6b4b608afd131c1fc2087d3ec45bd6a09d09c6de5b77b3f4f5150830c801256"} Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.219954 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"30ed0b8d-41c8-4175-a998-6109bc3ec143","Type":"ContainerStarted","Data":"e6c1d5db494e2c4fb72ad7da07fdcd9189a89bd7788dbc01c4cca85cfb4e53f3"} Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.219982 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"30ed0b8d-41c8-4175-a998-6109bc3ec143","Type":"ContainerStarted","Data":"4e368d680686acffee83a9f1afe6feda917b4f903b565da56f0f0195f0898967"} Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.266107 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.266086647 podStartE2EDuration="2.266086647s" podCreationTimestamp="2026-03-20 13:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:13.238737924 +0000 UTC m=+1382.916461329" watchObservedRunningTime="2026-03-20 13:47:13.266086647 +0000 UTC m=+1382.943810042" Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.923971 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.936190 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw5bm\" (UniqueName: \"kubernetes.io/projected/07a19c91-f95f-456b-a8dd-52743845e141-kube-api-access-gw5bm\") pod \"07a19c91-f95f-456b-a8dd-52743845e141\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.936246 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-nova-metadata-tls-certs\") pod \"07a19c91-f95f-456b-a8dd-52743845e141\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.936293 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-combined-ca-bundle\") pod \"07a19c91-f95f-456b-a8dd-52743845e141\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " Mar 20 13:47:13 crc 
kubenswrapper[4849]: I0320 13:47:13.936451 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-config-data\") pod \"07a19c91-f95f-456b-a8dd-52743845e141\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.936589 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07a19c91-f95f-456b-a8dd-52743845e141-logs\") pod \"07a19c91-f95f-456b-a8dd-52743845e141\" (UID: \"07a19c91-f95f-456b-a8dd-52743845e141\") " Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.937476 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a19c91-f95f-456b-a8dd-52743845e141-logs" (OuterVolumeSpecName: "logs") pod "07a19c91-f95f-456b-a8dd-52743845e141" (UID: "07a19c91-f95f-456b-a8dd-52743845e141"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:13 crc kubenswrapper[4849]: I0320 13:47:13.945413 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a19c91-f95f-456b-a8dd-52743845e141-kube-api-access-gw5bm" (OuterVolumeSpecName: "kube-api-access-gw5bm") pod "07a19c91-f95f-456b-a8dd-52743845e141" (UID: "07a19c91-f95f-456b-a8dd-52743845e141"). InnerVolumeSpecName "kube-api-access-gw5bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.003070 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07a19c91-f95f-456b-a8dd-52743845e141" (UID: "07a19c91-f95f-456b-a8dd-52743845e141"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.015157 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-config-data" (OuterVolumeSpecName: "config-data") pod "07a19c91-f95f-456b-a8dd-52743845e141" (UID: "07a19c91-f95f-456b-a8dd-52743845e141"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.033592 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "07a19c91-f95f-456b-a8dd-52743845e141" (UID: "07a19c91-f95f-456b-a8dd-52743845e141"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.039280 4849 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.039318 4849 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07a19c91-f95f-456b-a8dd-52743845e141-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.039333 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw5bm\" (UniqueName: \"kubernetes.io/projected/07a19c91-f95f-456b-a8dd-52743845e141-kube-api-access-gw5bm\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.039345 4849 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-nova-metadata-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.039357 4849 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07a19c91-f95f-456b-a8dd-52743845e141-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.240487 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a506df4-da99-4034-91dc-ede44412cebf","Type":"ContainerStarted","Data":"ef4d5d43ad451f1f01cc8f2e1d87e0f595efb0d1509aa197cc7a3ce9ac088250"} Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.246788 4849 generic.go:334] "Generic (PLEG): container finished" podID="07a19c91-f95f-456b-a8dd-52743845e141" containerID="b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d" exitCode=0 Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.246908 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.247051 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07a19c91-f95f-456b-a8dd-52743845e141","Type":"ContainerDied","Data":"b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d"} Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.247130 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07a19c91-f95f-456b-a8dd-52743845e141","Type":"ContainerDied","Data":"db59972bb294284a09fd804b2a49006d8d9a8550d56eb9a0218d0874434d330c"} Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.247164 4849 scope.go:117] "RemoveContainer" containerID="b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.257970 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" 
podStartSLOduration=2.257944568 podStartE2EDuration="2.257944568s" podCreationTimestamp="2026-03-20 13:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:14.255549535 +0000 UTC m=+1383.933272940" watchObservedRunningTime="2026-03-20 13:47:14.257944568 +0000 UTC m=+1383.935667973" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.287929 4849 scope.go:117] "RemoveContainer" containerID="05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.300035 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.332029 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.335721 4849 scope.go:117] "RemoveContainer" containerID="b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d" Mar 20 13:47:14 crc kubenswrapper[4849]: E0320 13:47:14.336151 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d\": container with ID starting with b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d not found: ID does not exist" containerID="b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.336187 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d"} err="failed to get container status \"b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d\": rpc error: code = NotFound desc = could not find container \"b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d\": container with ID 
starting with b3773be8263044d7301ddda6c3de854586646cecde9e61756881c3b9da6de43d not found: ID does not exist" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.336211 4849 scope.go:117] "RemoveContainer" containerID="05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb" Mar 20 13:47:14 crc kubenswrapper[4849]: E0320 13:47:14.336494 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb\": container with ID starting with 05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb not found: ID does not exist" containerID="05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.336520 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb"} err="failed to get container status \"05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb\": rpc error: code = NotFound desc = could not find container \"05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb\": container with ID starting with 05fd4d1ded20fcbf6e07cc80b9d6e8dd52fd402254c442c22006fca7431298eb not found: ID does not exist" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.348885 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:14 crc kubenswrapper[4849]: E0320 13:47:14.349336 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-metadata" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.349366 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-metadata" Mar 20 13:47:14 crc kubenswrapper[4849]: E0320 13:47:14.349406 4849 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-log" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.349417 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-log" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.349591 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-metadata" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.349620 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a19c91-f95f-456b-a8dd-52743845e141" containerName="nova-metadata-log" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.350619 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.353139 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.353471 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.359134 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.548004 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.548077 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.548142 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9396914f-9752-4915-ae6d-d2273935d774-logs\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.548283 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6szd\" (UniqueName: \"kubernetes.io/projected/9396914f-9752-4915-ae6d-d2273935d774-kube-api-access-v6szd\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.548397 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-config-data\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.650285 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.650396 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9396914f-9752-4915-ae6d-d2273935d774-logs\") pod \"nova-metadata-0\" (UID: 
\"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.650437 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6szd\" (UniqueName: \"kubernetes.io/projected/9396914f-9752-4915-ae6d-d2273935d774-kube-api-access-v6szd\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.650477 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-config-data\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.650530 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.651067 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9396914f-9752-4915-ae6d-d2273935d774-logs\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.654410 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.657242 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.657450 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9396914f-9752-4915-ae6d-d2273935d774-config-data\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.667503 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6szd\" (UniqueName: \"kubernetes.io/projected/9396914f-9752-4915-ae6d-d2273935d774-kube-api-access-v6szd\") pod \"nova-metadata-0\" (UID: \"9396914f-9752-4915-ae6d-d2273935d774\") " pod="openstack/nova-metadata-0" Mar 20 13:47:14 crc kubenswrapper[4849]: I0320 13:47:14.679991 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:15 crc kubenswrapper[4849]: I0320 13:47:15.047730 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a19c91-f95f-456b-a8dd-52743845e141" path="/var/lib/kubelet/pods/07a19c91-f95f-456b-a8dd-52743845e141/volumes" Mar 20 13:47:15 crc kubenswrapper[4849]: I0320 13:47:15.175146 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:15 crc kubenswrapper[4849]: W0320 13:47:15.178390 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9396914f_9752_4915_ae6d_d2273935d774.slice/crio-a8db00db80b7d22e4dd5add3fc92fac3ad6417da9cd1e0fb178f906de757a4af WatchSource:0}: Error finding container a8db00db80b7d22e4dd5add3fc92fac3ad6417da9cd1e0fb178f906de757a4af: Status 404 returned error can't find the container with id a8db00db80b7d22e4dd5add3fc92fac3ad6417da9cd1e0fb178f906de757a4af Mar 20 13:47:15 crc kubenswrapper[4849]: I0320 13:47:15.261245 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9396914f-9752-4915-ae6d-d2273935d774","Type":"ContainerStarted","Data":"a8db00db80b7d22e4dd5add3fc92fac3ad6417da9cd1e0fb178f906de757a4af"} Mar 20 13:47:16 crc kubenswrapper[4849]: I0320 13:47:16.275972 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9396914f-9752-4915-ae6d-d2273935d774","Type":"ContainerStarted","Data":"61a78276b3203e8466504852b6a7a253cb1b5888abf8456d5ce89a1918daba1d"} Mar 20 13:47:16 crc kubenswrapper[4849]: I0320 13:47:16.276299 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9396914f-9752-4915-ae6d-d2273935d774","Type":"ContainerStarted","Data":"070777769e1682b8332e15f5994047c9d0ebbf8b136673c2ac5d80688828b977"} Mar 20 13:47:16 crc kubenswrapper[4849]: I0320 13:47:16.305929 4849 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.305908386 podStartE2EDuration="2.305908386s" podCreationTimestamp="2026-03-20 13:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:16.303893604 +0000 UTC m=+1385.981617039" watchObservedRunningTime="2026-03-20 13:47:16.305908386 +0000 UTC m=+1385.983631801" Mar 20 13:47:17 crc kubenswrapper[4849]: I0320 13:47:17.660424 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:47:21 crc kubenswrapper[4849]: I0320 13:47:21.541415 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:47:21 crc kubenswrapper[4849]: I0320 13:47:21.542129 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:47:22 crc kubenswrapper[4849]: I0320 13:47:22.593032 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="30ed0b8d-41c8-4175-a998-6109bc3ec143" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:22 crc kubenswrapper[4849]: I0320 13:47:22.593261 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="30ed0b8d-41c8-4175-a998-6109bc3ec143" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:22 crc kubenswrapper[4849]: I0320 13:47:22.660985 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:47:22 crc kubenswrapper[4849]: I0320 13:47:22.688570 4849 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:47:23 crc kubenswrapper[4849]: I0320 13:47:23.374346 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:47:24 crc kubenswrapper[4849]: I0320 13:47:24.680857 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:47:24 crc kubenswrapper[4849]: I0320 13:47:24.681167 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:47:25 crc kubenswrapper[4849]: I0320 13:47:25.692109 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9396914f-9752-4915-ae6d-d2273935d774" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:25 crc kubenswrapper[4849]: I0320 13:47:25.692207 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9396914f-9752-4915-ae6d-d2273935d774" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:29 crc kubenswrapper[4849]: I0320 13:47:29.525521 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:47:29 crc kubenswrapper[4849]: I0320 13:47:29.541999 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:47:29 crc kubenswrapper[4849]: I0320 13:47:29.542051 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:47:31 crc kubenswrapper[4849]: I0320 13:47:31.549718 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:47:31 crc 
kubenswrapper[4849]: I0320 13:47:31.555683 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:47:31 crc kubenswrapper[4849]: I0320 13:47:31.558487 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:47:32 crc kubenswrapper[4849]: I0320 13:47:32.444214 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:47:32 crc kubenswrapper[4849]: I0320 13:47:32.680739 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:47:32 crc kubenswrapper[4849]: I0320 13:47:32.680808 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:47:34 crc kubenswrapper[4849]: I0320 13:47:34.686545 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:47:34 crc kubenswrapper[4849]: I0320 13:47:34.692528 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:47:34 crc kubenswrapper[4849]: I0320 13:47:34.692852 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:47:35 crc kubenswrapper[4849]: I0320 13:47:35.469901 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:47:39 crc kubenswrapper[4849]: I0320 13:47:39.384547 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:47:39 crc kubenswrapper[4849]: I0320 13:47:39.384909 4849 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:47:39 crc kubenswrapper[4849]: I0320 13:47:39.384962 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:47:39 crc kubenswrapper[4849]: I0320 13:47:39.385619 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a773bf9c49237a354678cd3d44df741a149b88bc3cf8989ae80a8e48fb75b7b"} pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:47:39 crc kubenswrapper[4849]: I0320 13:47:39.385707 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" containerID="cri-o://7a773bf9c49237a354678cd3d44df741a149b88bc3cf8989ae80a8e48fb75b7b" gracePeriod=600 Mar 20 13:47:40 crc kubenswrapper[4849]: I0320 13:47:40.522350 4849 generic.go:334] "Generic (PLEG): container finished" podID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerID="7a773bf9c49237a354678cd3d44df741a149b88bc3cf8989ae80a8e48fb75b7b" exitCode=0 Mar 20 13:47:40 crc kubenswrapper[4849]: I0320 13:47:40.522847 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerDied","Data":"7a773bf9c49237a354678cd3d44df741a149b88bc3cf8989ae80a8e48fb75b7b"} Mar 20 13:47:40 crc kubenswrapper[4849]: I0320 13:47:40.522878 4849 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"} Mar 20 13:47:40 crc kubenswrapper[4849]: I0320 13:47:40.522896 4849 scope.go:117] "RemoveContainer" containerID="320fbdc873fdc9693c329a47d54d9c46e735feb487e1c2d7c4da734e3de67821" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.139707 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566908-fp2m7"] Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.141762 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-fp2m7" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.145504 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.145787 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.145962 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.150991 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-fp2m7"] Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.231865 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9bnx\" (UniqueName: \"kubernetes.io/projected/3c6bf630-68ee-40f6-831b-feb110e2bc2e-kube-api-access-d9bnx\") pod \"auto-csr-approver-29566908-fp2m7\" (UID: \"3c6bf630-68ee-40f6-831b-feb110e2bc2e\") " pod="openshift-infra/auto-csr-approver-29566908-fp2m7" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.333489 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9bnx\" (UniqueName: \"kubernetes.io/projected/3c6bf630-68ee-40f6-831b-feb110e2bc2e-kube-api-access-d9bnx\") pod \"auto-csr-approver-29566908-fp2m7\" (UID: \"3c6bf630-68ee-40f6-831b-feb110e2bc2e\") " pod="openshift-infra/auto-csr-approver-29566908-fp2m7" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.358255 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9bnx\" (UniqueName: \"kubernetes.io/projected/3c6bf630-68ee-40f6-831b-feb110e2bc2e-kube-api-access-d9bnx\") pod \"auto-csr-approver-29566908-fp2m7\" (UID: \"3c6bf630-68ee-40f6-831b-feb110e2bc2e\") " pod="openshift-infra/auto-csr-approver-29566908-fp2m7" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.464169 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-fp2m7" Mar 20 13:48:00 crc kubenswrapper[4849]: I0320 13:48:00.913095 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-fp2m7"] Mar 20 13:48:01 crc kubenswrapper[4849]: I0320 13:48:01.732975 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-fp2m7" event={"ID":"3c6bf630-68ee-40f6-831b-feb110e2bc2e","Type":"ContainerStarted","Data":"41fe6bdb932a67a1195f7df5506be549b4e30db805c0bfd31428719b7223a3c3"} Mar 20 13:48:02 crc kubenswrapper[4849]: I0320 13:48:02.780748 4849 generic.go:334] "Generic (PLEG): container finished" podID="3c6bf630-68ee-40f6-831b-feb110e2bc2e" containerID="23d12cc0933c0ce1719047e9148b5861d1359d7aacd36f1966df954a69ecf56f" exitCode=0 Mar 20 13:48:02 crc kubenswrapper[4849]: I0320 13:48:02.780845 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-fp2m7" 
event={"ID":"3c6bf630-68ee-40f6-831b-feb110e2bc2e","Type":"ContainerDied","Data":"23d12cc0933c0ce1719047e9148b5861d1359d7aacd36f1966df954a69ecf56f"} Mar 20 13:48:04 crc kubenswrapper[4849]: I0320 13:48:04.093774 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-fp2m7" Mar 20 13:48:04 crc kubenswrapper[4849]: I0320 13:48:04.199454 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9bnx\" (UniqueName: \"kubernetes.io/projected/3c6bf630-68ee-40f6-831b-feb110e2bc2e-kube-api-access-d9bnx\") pod \"3c6bf630-68ee-40f6-831b-feb110e2bc2e\" (UID: \"3c6bf630-68ee-40f6-831b-feb110e2bc2e\") " Mar 20 13:48:04 crc kubenswrapper[4849]: I0320 13:48:04.205635 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6bf630-68ee-40f6-831b-feb110e2bc2e-kube-api-access-d9bnx" (OuterVolumeSpecName: "kube-api-access-d9bnx") pod "3c6bf630-68ee-40f6-831b-feb110e2bc2e" (UID: "3c6bf630-68ee-40f6-831b-feb110e2bc2e"). InnerVolumeSpecName "kube-api-access-d9bnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:04 crc kubenswrapper[4849]: I0320 13:48:04.301453 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9bnx\" (UniqueName: \"kubernetes.io/projected/3c6bf630-68ee-40f6-831b-feb110e2bc2e-kube-api-access-d9bnx\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:04 crc kubenswrapper[4849]: I0320 13:48:04.808417 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-fp2m7" event={"ID":"3c6bf630-68ee-40f6-831b-feb110e2bc2e","Type":"ContainerDied","Data":"41fe6bdb932a67a1195f7df5506be549b4e30db805c0bfd31428719b7223a3c3"} Mar 20 13:48:04 crc kubenswrapper[4849]: I0320 13:48:04.808494 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41fe6bdb932a67a1195f7df5506be549b4e30db805c0bfd31428719b7223a3c3" Mar 20 13:48:04 crc kubenswrapper[4849]: I0320 13:48:04.809014 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-fp2m7" Mar 20 13:48:05 crc kubenswrapper[4849]: I0320 13:48:05.176692 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-hv5cs"] Mar 20 13:48:05 crc kubenswrapper[4849]: I0320 13:48:05.184249 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-hv5cs"] Mar 20 13:48:07 crc kubenswrapper[4849]: I0320 13:48:07.046020 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73c059a-d570-4e73-8119-9f14474c9d99" path="/var/lib/kubelet/pods/a73c059a-d570-4e73-8119-9f14474c9d99/volumes" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.067047 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cj48j/must-gather-tdtwd"] Mar 20 13:48:14 crc kubenswrapper[4849]: E0320 13:48:14.068056 4849 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3c6bf630-68ee-40f6-831b-feb110e2bc2e" containerName="oc" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.068074 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6bf630-68ee-40f6-831b-feb110e2bc2e" containerName="oc" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.068332 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6bf630-68ee-40f6-831b-feb110e2bc2e" containerName="oc" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.073044 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.078267 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cj48j"/"openshift-service-ca.crt" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.078648 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cj48j"/"kube-root-ca.crt" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.109724 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cj48j/must-gather-tdtwd"] Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.190961 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n65z\" (UniqueName: \"kubernetes.io/projected/d7e6493a-981c-4e06-ade3-77a34e3da785-kube-api-access-8n65z\") pod \"must-gather-tdtwd\" (UID: \"d7e6493a-981c-4e06-ade3-77a34e3da785\") " pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.191257 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e6493a-981c-4e06-ade3-77a34e3da785-must-gather-output\") pod \"must-gather-tdtwd\" (UID: \"d7e6493a-981c-4e06-ade3-77a34e3da785\") " pod="openshift-must-gather-cj48j/must-gather-tdtwd" 
Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.293228 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n65z\" (UniqueName: \"kubernetes.io/projected/d7e6493a-981c-4e06-ade3-77a34e3da785-kube-api-access-8n65z\") pod \"must-gather-tdtwd\" (UID: \"d7e6493a-981c-4e06-ade3-77a34e3da785\") " pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.293310 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e6493a-981c-4e06-ade3-77a34e3da785-must-gather-output\") pod \"must-gather-tdtwd\" (UID: \"d7e6493a-981c-4e06-ade3-77a34e3da785\") " pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.293749 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e6493a-981c-4e06-ade3-77a34e3da785-must-gather-output\") pod \"must-gather-tdtwd\" (UID: \"d7e6493a-981c-4e06-ade3-77a34e3da785\") " pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.313017 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n65z\" (UniqueName: \"kubernetes.io/projected/d7e6493a-981c-4e06-ade3-77a34e3da785-kube-api-access-8n65z\") pod \"must-gather-tdtwd\" (UID: \"d7e6493a-981c-4e06-ade3-77a34e3da785\") " pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.405763 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:48:14 crc kubenswrapper[4849]: W0320 13:48:14.835155 4849 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7e6493a_981c_4e06_ade3_77a34e3da785.slice/crio-3043a00520856d8b33751f00c367c24ff3e3a899ad8c6b34af78d5fd70dfa61c WatchSource:0}: Error finding container 3043a00520856d8b33751f00c367c24ff3e3a899ad8c6b34af78d5fd70dfa61c: Status 404 returned error can't find the container with id 3043a00520856d8b33751f00c367c24ff3e3a899ad8c6b34af78d5fd70dfa61c Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.838052 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cj48j/must-gather-tdtwd"] Mar 20 13:48:14 crc kubenswrapper[4849]: I0320 13:48:14.894960 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/must-gather-tdtwd" event={"ID":"d7e6493a-981c-4e06-ade3-77a34e3da785","Type":"ContainerStarted","Data":"3043a00520856d8b33751f00c367c24ff3e3a899ad8c6b34af78d5fd70dfa61c"} Mar 20 13:48:19 crc kubenswrapper[4849]: I0320 13:48:19.938543 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/must-gather-tdtwd" event={"ID":"d7e6493a-981c-4e06-ade3-77a34e3da785","Type":"ContainerStarted","Data":"b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150"} Mar 20 13:48:19 crc kubenswrapper[4849]: I0320 13:48:19.939100 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/must-gather-tdtwd" event={"ID":"d7e6493a-981c-4e06-ade3-77a34e3da785","Type":"ContainerStarted","Data":"9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f"} Mar 20 13:48:19 crc kubenswrapper[4849]: I0320 13:48:19.957199 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cj48j/must-gather-tdtwd" podStartSLOduration=1.8229854479999998 
podStartE2EDuration="5.957182632s" podCreationTimestamp="2026-03-20 13:48:14 +0000 UTC" firstStartedPulling="2026-03-20 13:48:14.837391621 +0000 UTC m=+1444.515115016" lastFinishedPulling="2026-03-20 13:48:18.971588804 +0000 UTC m=+1448.649312200" observedRunningTime="2026-03-20 13:48:19.951753449 +0000 UTC m=+1449.629476844" watchObservedRunningTime="2026-03-20 13:48:19.957182632 +0000 UTC m=+1449.634906027" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.272694 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cj48j/crc-debug-9z228"] Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.274632 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.276295 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-cj48j"/"default-dockercfg-srj7r" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.422247 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/002f6ab8-4536-482f-9c26-b3195bcbb7c8-host\") pod \"crc-debug-9z228\" (UID: \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\") " pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.422364 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4d8l\" (UniqueName: \"kubernetes.io/projected/002f6ab8-4536-482f-9c26-b3195bcbb7c8-kube-api-access-f4d8l\") pod \"crc-debug-9z228\" (UID: \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\") " pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.524589 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/002f6ab8-4536-482f-9c26-b3195bcbb7c8-host\") pod 
\"crc-debug-9z228\" (UID: \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\") " pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.524715 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4d8l\" (UniqueName: \"kubernetes.io/projected/002f6ab8-4536-482f-9c26-b3195bcbb7c8-kube-api-access-f4d8l\") pod \"crc-debug-9z228\" (UID: \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\") " pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.525282 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/002f6ab8-4536-482f-9c26-b3195bcbb7c8-host\") pod \"crc-debug-9z228\" (UID: \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\") " pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.544536 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4d8l\" (UniqueName: \"kubernetes.io/projected/002f6ab8-4536-482f-9c26-b3195bcbb7c8-kube-api-access-f4d8l\") pod \"crc-debug-9z228\" (UID: \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\") " pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.610274 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:24 crc kubenswrapper[4849]: I0320 13:48:24.988031 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/crc-debug-9z228" event={"ID":"002f6ab8-4536-482f-9c26-b3195bcbb7c8","Type":"ContainerStarted","Data":"ab1fde2e0eaa3ff4d8b85188491cfaaf3da787a5c93b76cbc4c2c7f00f92c317"} Mar 20 13:48:37 crc kubenswrapper[4849]: I0320 13:48:37.112731 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/crc-debug-9z228" event={"ID":"002f6ab8-4536-482f-9c26-b3195bcbb7c8","Type":"ContainerStarted","Data":"98248c7bc50c98f50648d9dd478891ef225a2a5cd6cbb1d702ad7b6885f748e3"} Mar 20 13:48:37 crc kubenswrapper[4849]: I0320 13:48:37.155386 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cj48j/crc-debug-9z228" podStartSLOduration=1.8251607380000001 podStartE2EDuration="13.155344755s" podCreationTimestamp="2026-03-20 13:48:24 +0000 UTC" firstStartedPulling="2026-03-20 13:48:24.686257722 +0000 UTC m=+1454.363981117" lastFinishedPulling="2026-03-20 13:48:36.016441739 +0000 UTC m=+1465.694165134" observedRunningTime="2026-03-20 13:48:37.125217709 +0000 UTC m=+1466.802941124" watchObservedRunningTime="2026-03-20 13:48:37.155344755 +0000 UTC m=+1466.833068150" Mar 20 13:48:42 crc kubenswrapper[4849]: I0320 13:48:42.021607 4849 scope.go:117] "RemoveContainer" containerID="d9e371ebb37def4d04e95c4b419632fe0f70b889bee2d0cd5866c0702abd9deb" Mar 20 13:48:51 crc kubenswrapper[4849]: I0320 13:48:51.236708 4849 generic.go:334] "Generic (PLEG): container finished" podID="002f6ab8-4536-482f-9c26-b3195bcbb7c8" containerID="98248c7bc50c98f50648d9dd478891ef225a2a5cd6cbb1d702ad7b6885f748e3" exitCode=0 Mar 20 13:48:51 crc kubenswrapper[4849]: I0320 13:48:51.237129 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/crc-debug-9z228" 
event={"ID":"002f6ab8-4536-482f-9c26-b3195bcbb7c8","Type":"ContainerDied","Data":"98248c7bc50c98f50648d9dd478891ef225a2a5cd6cbb1d702ad7b6885f748e3"} Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.366936 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.421237 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cj48j/crc-debug-9z228"] Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.441331 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/002f6ab8-4536-482f-9c26-b3195bcbb7c8-host\") pod \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\" (UID: \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\") " Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.441418 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4d8l\" (UniqueName: \"kubernetes.io/projected/002f6ab8-4536-482f-9c26-b3195bcbb7c8-kube-api-access-f4d8l\") pod \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\" (UID: \"002f6ab8-4536-482f-9c26-b3195bcbb7c8\") " Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.441493 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/002f6ab8-4536-482f-9c26-b3195bcbb7c8-host" (OuterVolumeSpecName: "host") pod "002f6ab8-4536-482f-9c26-b3195bcbb7c8" (UID: "002f6ab8-4536-482f-9c26-b3195bcbb7c8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.446251 4849 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/002f6ab8-4536-482f-9c26-b3195bcbb7c8-host\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.446921 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/002f6ab8-4536-482f-9c26-b3195bcbb7c8-kube-api-access-f4d8l" (OuterVolumeSpecName: "kube-api-access-f4d8l") pod "002f6ab8-4536-482f-9c26-b3195bcbb7c8" (UID: "002f6ab8-4536-482f-9c26-b3195bcbb7c8"). InnerVolumeSpecName "kube-api-access-f4d8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.447285 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cj48j/crc-debug-9z228"] Mar 20 13:48:52 crc kubenswrapper[4849]: I0320 13:48:52.548078 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4d8l\" (UniqueName: \"kubernetes.io/projected/002f6ab8-4536-482f-9c26-b3195bcbb7c8-kube-api-access-f4d8l\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.046397 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="002f6ab8-4536-482f-9c26-b3195bcbb7c8" path="/var/lib/kubelet/pods/002f6ab8-4536-482f-9c26-b3195bcbb7c8/volumes" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.254486 4849 scope.go:117] "RemoveContainer" containerID="98248c7bc50c98f50648d9dd478891ef225a2a5cd6cbb1d702ad7b6885f748e3" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.254528 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cj48j/crc-debug-9z228" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.611665 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cj48j/crc-debug-ggsb2"] Mar 20 13:48:53 crc kubenswrapper[4849]: E0320 13:48:53.612086 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="002f6ab8-4536-482f-9c26-b3195bcbb7c8" containerName="container-00" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.612100 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="002f6ab8-4536-482f-9c26-b3195bcbb7c8" containerName="container-00" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.612291 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="002f6ab8-4536-482f-9c26-b3195bcbb7c8" containerName="container-00" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.612913 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.615465 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-cj48j"/"default-dockercfg-srj7r" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.666693 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxcfw\" (UniqueName: \"kubernetes.io/projected/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-kube-api-access-bxcfw\") pod \"crc-debug-ggsb2\" (UID: \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\") " pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.667090 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-host\") pod \"crc-debug-ggsb2\" (UID: \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\") " 
pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.769352 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-host\") pod \"crc-debug-ggsb2\" (UID: \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\") " pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.769485 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-host\") pod \"crc-debug-ggsb2\" (UID: \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\") " pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.769814 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxcfw\" (UniqueName: \"kubernetes.io/projected/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-kube-api-access-bxcfw\") pod \"crc-debug-ggsb2\" (UID: \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\") " pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.787183 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxcfw\" (UniqueName: \"kubernetes.io/projected/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-kube-api-access-bxcfw\") pod \"crc-debug-ggsb2\" (UID: \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\") " pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:53 crc kubenswrapper[4849]: I0320 13:48:53.928228 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:54 crc kubenswrapper[4849]: I0320 13:48:54.264238 4849 generic.go:334] "Generic (PLEG): container finished" podID="2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a" containerID="a50ed2baf9a8762487b908a7fa0e47fbeaae849cae4adf0e549022639eec475a" exitCode=1 Mar 20 13:48:54 crc kubenswrapper[4849]: I0320 13:48:54.264346 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/crc-debug-ggsb2" event={"ID":"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a","Type":"ContainerDied","Data":"a50ed2baf9a8762487b908a7fa0e47fbeaae849cae4adf0e549022639eec475a"} Mar 20 13:48:54 crc kubenswrapper[4849]: I0320 13:48:54.264584 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/crc-debug-ggsb2" event={"ID":"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a","Type":"ContainerStarted","Data":"0c1e5d88d506f428ddc5df7e0631de812e01415e159cae4d57d4b3658a4d5a38"} Mar 20 13:48:54 crc kubenswrapper[4849]: I0320 13:48:54.302531 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cj48j/crc-debug-ggsb2"] Mar 20 13:48:54 crc kubenswrapper[4849]: I0320 13:48:54.312577 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cj48j/crc-debug-ggsb2"] Mar 20 13:48:55 crc kubenswrapper[4849]: I0320 13:48:55.367988 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:55 crc kubenswrapper[4849]: I0320 13:48:55.399596 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-host\") pod \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\" (UID: \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\") " Mar 20 13:48:55 crc kubenswrapper[4849]: I0320 13:48:55.399683 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-host" (OuterVolumeSpecName: "host") pod "2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a" (UID: "2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:48:55 crc kubenswrapper[4849]: I0320 13:48:55.400051 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxcfw\" (UniqueName: \"kubernetes.io/projected/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-kube-api-access-bxcfw\") pod \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\" (UID: \"2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a\") " Mar 20 13:48:55 crc kubenswrapper[4849]: I0320 13:48:55.400631 4849 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-host\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:55 crc kubenswrapper[4849]: I0320 13:48:55.404904 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-kube-api-access-bxcfw" (OuterVolumeSpecName: "kube-api-access-bxcfw") pod "2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a" (UID: "2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a"). InnerVolumeSpecName "kube-api-access-bxcfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:55 crc kubenswrapper[4849]: I0320 13:48:55.502069 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxcfw\" (UniqueName: \"kubernetes.io/projected/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a-kube-api-access-bxcfw\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:56 crc kubenswrapper[4849]: I0320 13:48:56.282088 4849 scope.go:117] "RemoveContainer" containerID="a50ed2baf9a8762487b908a7fa0e47fbeaae849cae4adf0e549022639eec475a" Mar 20 13:48:56 crc kubenswrapper[4849]: I0320 13:48:56.282370 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cj48j/crc-debug-ggsb2" Mar 20 13:48:57 crc kubenswrapper[4849]: I0320 13:48:57.048575 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a" path="/var/lib/kubelet/pods/2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a/volumes" Mar 20 13:49:29 crc kubenswrapper[4849]: I0320 13:49:29.904785 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-68775b9c9d-w9j9w_cbe1119f-65b7-4aea-a636-1d745ea8e3b6/barbican-api/0.log" Mar 20 13:49:30 crc kubenswrapper[4849]: I0320 13:49:30.094486 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-68775b9c9d-w9j9w_cbe1119f-65b7-4aea-a636-1d745ea8e3b6/barbican-api-log/0.log" Mar 20 13:49:30 crc kubenswrapper[4849]: I0320 13:49:30.109619 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-create-bxj2z_66983f57-dfbe-4c47-90f6-9eef82ebd9a1/mariadb-database-create/0.log" Mar 20 13:49:30 crc kubenswrapper[4849]: I0320 13:49:30.254850 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-gwq28_ee9399c2-4755-4acd-8514-7d49cdd92f16/barbican-db-sync/0.log" Mar 20 13:49:30 crc kubenswrapper[4849]: I0320 13:49:30.351304 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-e6b6-account-create-update-bxjkb_05bc515e-52ae-4e93-b967-a458d135ae12/mariadb-account-create-update/0.log" Mar 20 13:49:30 crc kubenswrapper[4849]: I0320 13:49:30.466370 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f759b9866-rl5dd_f30102de-0f18-4a4a-80e6-58d2de7c230d/barbican-keystone-listener/0.log" Mar 20 13:49:30 crc kubenswrapper[4849]: I0320 13:49:30.552575 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f759b9866-rl5dd_f30102de-0f18-4a4a-80e6-58d2de7c230d/barbican-keystone-listener-log/0.log" Mar 20 13:49:30 crc kubenswrapper[4849]: I0320 13:49:30.615935 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-845957dc9-clhc5_f29aa501-5db8-44ee-b155-a2ffe7b521bc/barbican-worker/0.log" Mar 20 13:49:30 crc kubenswrapper[4849]: I0320 13:49:30.882441 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-845957dc9-clhc5_f29aa501-5db8-44ee-b155-a2ffe7b521bc/barbican-worker-log/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.033791 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e6175e92-e13b-450c-82ca-c27cd658ba57/ceilometer-notification-agent/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.076915 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e6175e92-e13b-450c-82ca-c27cd658ba57/ceilometer-central-agent/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.081280 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e6175e92-e13b-450c-82ca-c27cd658ba57/proxy-httpd/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.110909 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_e6175e92-e13b-450c-82ca-c27cd658ba57/sg-core/0.log" Mar 20 13:49:31 crc 
kubenswrapper[4849]: I0320 13:49:31.246253 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-2589-account-create-update-q8rmz_6155d2c7-33f5-4bbb-b6a0-a378848a08e5/mariadb-account-create-update/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.344898 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c45c13e7-1cf2-4e2a-993b-83f1aa9428cb/cinder-api/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.450657 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c45c13e7-1cf2-4e2a-993b-83f1aa9428cb/cinder-api-log/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.535507 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-gqdxs_5b5fb05e-2f40-432f-acf5-068f32e62698/mariadb-database-create/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.642690 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-sync-jk575_701dfbaa-ecac-4290-9402-90c866ccd108/cinder-db-sync/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.762167 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2ff8045e-a7ad-400e-9e02-cbc7dc3a248c/cinder-scheduler/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.857832 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_2ff8045e-a7ad-400e-9e02-cbc7dc3a248c/probe/0.log" Mar 20 13:49:31 crc kubenswrapper[4849]: I0320 13:49:31.941067 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-ns6mb_80896962-f9f0-4207-a772-be4cf354e8e6/init/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.123561 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-ns6mb_80896962-f9f0-4207-a772-be4cf354e8e6/init/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 
13:49:32.162922 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-9d498_69beefe1-45de-469f-a3af-e42a88b38309/mariadb-database-create/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.187316 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-89c5cd4d5-ns6mb_80896962-f9f0-4207-a772-be4cf354e8e6/dnsmasq-dns/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.359350 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0245248e-3173-455f-9610-41be03d97ab1/glance-httpd/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.370580 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-66s8p_4baaa4a5-7434-40f0-bfee-185b7fc4fafb/glance-db-sync/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.516698 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0245248e-3173-455f-9610-41be03d97ab1/glance-log/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.636304 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99/glance-httpd/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.641373 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_9cbd391c-e3d3-4a9c-bcc8-ae02b3ed4e99/glance-log/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.742967 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-fdbf-account-create-update-2xkp5_b0a03006-5384-4542-8b30-dc8bea37c96a/mariadb-account-create-update/0.log" Mar 20 13:49:32 crc kubenswrapper[4849]: I0320 13:49:32.916776 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f784755c6-j267c_852cbb75-7003-4545-9b7b-b2eb83d269ac/horizon/0.log" Mar 20 13:49:32 crc 
kubenswrapper[4849]: I0320 13:49:32.977887 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7f784755c6-j267c_852cbb75-7003-4545-9b7b-b2eb83d269ac/horizon-log/0.log" Mar 20 13:49:33 crc kubenswrapper[4849]: I0320 13:49:33.100300 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-0a99-account-create-update-8k85v_481f5fb9-0040-4372-96b7-15e549dab23a/mariadb-account-create-update/0.log" Mar 20 13:49:33 crc kubenswrapper[4849]: I0320 13:49:33.231024 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7d5f885888-6vtg6_f4f4b109-3301-4c4d-9f8f-6fc3fe7b41e9/keystone-api/0.log" Mar 20 13:49:33 crc kubenswrapper[4849]: I0320 13:49:33.281700 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-6qksz_6f18d572-488f-4e4e-9596-3b99b5298123/keystone-bootstrap/0.log" Mar 20 13:49:33 crc kubenswrapper[4849]: I0320 13:49:33.443473 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-qft7p_c0fd8a46-86a6-403d-b740-ddd048bdc4b0/mariadb-database-create/0.log" Mar 20 13:49:33 crc kubenswrapper[4849]: I0320 13:49:33.479619 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-xvqnp_6129a249-c10a-4299-90c6-147c58b4926e/keystone-db-sync/0.log" Mar 20 13:49:33 crc kubenswrapper[4849]: I0320 13:49:33.860759 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_a36f6ced-ab2b-44a3-b7d6-c4744d7f959d/kube-state-metrics/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.094524 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-1a7e-account-create-update-lqd64_01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c/mariadb-account-create-update/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.208046 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-749b7fc4bf-nwzdf_8d4c5d75-d3c3-4035-852d-026e9183444c/neutron-api/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.286881 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-749b7fc4bf-nwzdf_8d4c5d75-d3c3-4035-852d-026e9183444c/neutron-httpd/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.454049 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-jdcns_3d1422a9-d5f8-4349-8513-0bd372fa8500/mariadb-database-create/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.490418 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-wh76b_c2a5cf24-7a8d-40f9-87cc-0b9b6533e520/neutron-db-sync/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.720636 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-653c-account-create-update-7p7tb_5336158a-f129-45cf-a73c-5e0733002023/mariadb-account-create-update/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.727638 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_30ed0b8d-41c8-4175-a998-6109bc3ec143/nova-api-api/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.779707 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_30ed0b8d-41c8-4175-a998-6109bc3ec143/nova-api-log/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.910777 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-hgx5j_fc762765-75e8-42df-bfd0-86cbad8172b3/mariadb-database-create/0.log" Mar 20 13:49:34 crc kubenswrapper[4849]: I0320 13:49:34.978376 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-5bc4-account-create-update-dggjx_84081e43-a4b1-4462-9b31-21d5d443d016/mariadb-account-create-update/0.log" Mar 20 13:49:35 crc kubenswrapper[4849]: I0320 13:49:35.138485 4849 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-bxjmr_ce9903d0-8cc8-4bce-99da-96d1e8657e2a/nova-manage/0.log" Mar 20 13:49:35 crc kubenswrapper[4849]: I0320 13:49:35.406943 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_6d9f9502-8fe8-4a51-9891-29506dce2581/nova-cell0-conductor-conductor/0.log" Mar 20 13:49:35 crc kubenswrapper[4849]: I0320 13:49:35.410634 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-bsbt2_c100d127-fda4-4f86-89d7-64a19be3e8ea/nova-cell0-conductor-db-sync/0.log" Mar 20 13:49:35 crc kubenswrapper[4849]: I0320 13:49:35.601127 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-2kmcz_b1d86487-6a56-429a-a4af-afc82ed6a843/mariadb-database-create/0.log" Mar 20 13:49:35 crc kubenswrapper[4849]: I0320 13:49:35.676400 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-c417-account-create-update-x9vgq_d6fe63e8-731d-4c04-9679-25635974e8ce/mariadb-account-create-update/0.log" Mar 20 13:49:35 crc kubenswrapper[4849]: I0320 13:49:35.800388 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-qj5cd_e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a/nova-manage/0.log" Mar 20 13:49:35 crc kubenswrapper[4849]: I0320 13:49:35.952302 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4c1b8d9e-0e94-4c17-8f58-3b9f95c54d75/nova-cell1-conductor-conductor/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.022010 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-ccqkc_cfff4046-20be-4224-8bc7-0741b2fd01a7/nova-cell1-conductor-db-sync/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.102258 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-db-create-4lmj8_5fafae6e-b99e-4561-8caa-84a392b5e463/mariadb-database-create/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.281264 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_957dbf28-7479-44c7-96ed-787f99da4249/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.544185 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9396914f-9752-4915-ae6d-d2273935d774/nova-metadata-metadata/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.585761 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9396914f-9752-4915-ae6d-d2273935d774/nova-metadata-log/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.698107 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_4a506df4-da99-4034-91dc-ede44412cebf/nova-scheduler-scheduler/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.782767 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4f905722-c565-4fe5-bdde-0df02a23b833/mysql-bootstrap/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.974471 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b4ef098b-892c-4619-a5b4-7c10cdf47f9b/mysql-bootstrap/0.log" Mar 20 13:49:36 crc kubenswrapper[4849]: I0320 13:49:36.975085 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4f905722-c565-4fe5-bdde-0df02a23b833/mysql-bootstrap/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.026129 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4f905722-c565-4fe5-bdde-0df02a23b833/galera/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.199805 4849 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstack-galera-0_b4ef098b-892c-4619-a5b4-7c10cdf47f9b/mysql-bootstrap/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.232541 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d6c971ba-2837-43ce-900d-c28d956ab162/openstackclient/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.332038 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b4ef098b-892c-4619-a5b4-7c10cdf47f9b/galera/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.464945 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9znfj_f589037a-06aa-452d-82ef-0dbf2177b7fc/ovn-controller/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.614935 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bgrqc_e5dac0d1-f9a7-4671-8fd5-5030df3fc592/openstack-network-exporter/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.733472 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-226bs_57363bb0-8542-49ea-95b9-84fd9206f644/ovsdb-server-init/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.914433 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-226bs_57363bb0-8542-49ea-95b9-84fd9206f644/ovsdb-server/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.914705 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-226bs_57363bb0-8542-49ea-95b9-84fd9206f644/ovsdb-server-init/0.log" Mar 20 13:49:37 crc kubenswrapper[4849]: I0320 13:49:37.929663 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-226bs_57363bb0-8542-49ea-95b9-84fd9206f644/ovs-vswitchd/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.137033 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_fad9aa5d-a9a2-40b5-a51b-9ff1f934844f/openstack-network-exporter/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.149223 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fad9aa5d-a9a2-40b5-a51b-9ff1f934844f/ovn-northd/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.254531 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_91eeca7c-4c91-4b2f-8541-be7b6a36b582/openstack-network-exporter/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.358497 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9c5e6b3b-dc09-46d1-aac2-6625c28896fb/openstack-network-exporter/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.389636 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_91eeca7c-4c91-4b2f-8541-be7b6a36b582/ovsdbserver-nb/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.512995 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9c5e6b3b-dc09-46d1-aac2-6625c28896fb/ovsdbserver-sb/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.568014 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-3d75-account-create-update-6bd9j_0b28b324-7675-41b1-b1af-e37801c55af0/mariadb-account-create-update/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.745630 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f89c68c76-ng5km_717601ae-c668-47e5-8d74-a9cb9cf8e940/placement-api/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.770432 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f89c68c76-ng5km_717601ae-c668-47e5-8d74-a9cb9cf8e940/placement-log/0.log" Mar 20 13:49:38 crc kubenswrapper[4849]: I0320 13:49:38.879263 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-db-create-pjvhk_81b9c6f1-f1c5-4310-9c95-649b730470a5/mariadb-database-create/0.log" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.005070 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-sync-xbwzw_39790e43-e227-4e13-8054-995e12255ec8/placement-db-sync/0.log" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.095698 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c3c4952-4c22-4389-834c-969b89fb9e20/setup-container/0.log" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.384052 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.384121 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.482662 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c3c4952-4c22-4389-834c-969b89fb9e20/setup-container/0.log" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.517859 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3c3c4952-4c22-4389-834c-969b89fb9e20/rabbitmq/0.log" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.559728 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_464306bd-0d8b-40ca-aa64-1ec5a00a527b/setup-container/0.log" Mar 20 13:49:39 crc 
kubenswrapper[4849]: I0320 13:49:39.736778 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_464306bd-0d8b-40ca-aa64-1ec5a00a527b/setup-container/0.log" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.753073 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_464306bd-0d8b-40ca-aa64-1ec5a00a527b/rabbitmq/0.log" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.802221 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-k72fr_ef32d779-a195-46ae-9112-3ecdbfe73a1e/mariadb-account-create-update/0.log" Mar 20 13:49:39 crc kubenswrapper[4849]: I0320 13:49:39.993254 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-74f54cb475-dnvws_ccb7bb85-a9e1-4de8-83b1-081dd02455b9/proxy-server/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.027891 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-74f54cb475-dnvws_ccb7bb85-a9e1-4de8-83b1-081dd02455b9/proxy-httpd/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.141921 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-pmrvl_07ad4563-bfe9-462b-8191-f21c950281df/swift-ring-rebalance/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.270733 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/account-auditor/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.275540 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/account-reaper/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.370045 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/account-replicator/0.log" Mar 20 13:49:40 crc 
kubenswrapper[4849]: I0320 13:49:40.405743 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/account-server/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.501677 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/container-auditor/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.546785 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/container-replicator/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.554650 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/container-server/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.659917 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/container-updater/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.698422 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/object-auditor/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.766591 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/object-replicator/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.791989 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/object-expirer/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.877221 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/object-server/0.log" Mar 20 13:49:40 crc kubenswrapper[4849]: I0320 13:49:40.906066 4849 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/object-updater/0.log" Mar 20 13:49:41 crc kubenswrapper[4849]: I0320 13:49:41.022330 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/rsync/0.log" Mar 20 13:49:41 crc kubenswrapper[4849]: I0320 13:49:41.094439 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_189da4ab-90d9-4761-b94e-77f30a025385/swift-recon-cron/0.log" Mar 20 13:49:41 crc kubenswrapper[4849]: I0320 13:49:41.912229 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_200bac0a-008a-4528-bb22-3cf6e1ef6342/memcached/0.log" Mar 20 13:49:42 crc kubenswrapper[4849]: I0320 13:49:42.828360 4849 scope.go:117] "RemoveContainer" containerID="e9c29193f3b4ae7ba05fc46ceb0ce59a15cc3082f4b41e5226357a6f7dbc60a9" Mar 20 13:49:42 crc kubenswrapper[4849]: I0320 13:49:42.855904 4849 scope.go:117] "RemoveContainer" containerID="628e9a3aec8dbfa2c7fee28dd556fdcd30e193a34558b1ab9cab5c176af2dfb0" Mar 20 13:49:42 crc kubenswrapper[4849]: I0320 13:49:42.924889 4849 scope.go:117] "RemoveContainer" containerID="41a65b89ba16e1797ddad54421f40dcfa1aa9bcc02eb9ea68e86610033152b85" Mar 20 13:49:42 crc kubenswrapper[4849]: I0320 13:49:42.960940 4849 scope.go:117] "RemoveContainer" containerID="366ba86dbcac033ff4daf8784d2cc247035000df23e6aff3f237b22d6ac1b19d" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.148550 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566910-5f9s4"] Mar 20 13:50:00 crc kubenswrapper[4849]: E0320 13:50:00.149709 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a" containerName="container-00" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.149732 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a" 
containerName="container-00" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.150031 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee948de-adf1-4cfa-a7ba-8756a5a5cf1a" containerName="container-00" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.151068 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-5f9s4" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.153151 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.153208 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.155049 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.159005 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-5f9s4"] Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.209051 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjkc\" (UniqueName: \"kubernetes.io/projected/7dbd39df-02d8-4bc2-8953-9618afa3138d-kube-api-access-vnjkc\") pod \"auto-csr-approver-29566910-5f9s4\" (UID: \"7dbd39df-02d8-4bc2-8953-9618afa3138d\") " pod="openshift-infra/auto-csr-approver-29566910-5f9s4" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.310432 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjkc\" (UniqueName: \"kubernetes.io/projected/7dbd39df-02d8-4bc2-8953-9618afa3138d-kube-api-access-vnjkc\") pod \"auto-csr-approver-29566910-5f9s4\" (UID: \"7dbd39df-02d8-4bc2-8953-9618afa3138d\") " pod="openshift-infra/auto-csr-approver-29566910-5f9s4" Mar 20 13:50:00 crc 
kubenswrapper[4849]: I0320 13:50:00.334072 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjkc\" (UniqueName: \"kubernetes.io/projected/7dbd39df-02d8-4bc2-8953-9618afa3138d-kube-api-access-vnjkc\") pod \"auto-csr-approver-29566910-5f9s4\" (UID: \"7dbd39df-02d8-4bc2-8953-9618afa3138d\") " pod="openshift-infra/auto-csr-approver-29566910-5f9s4" Mar 20 13:50:00 crc kubenswrapper[4849]: I0320 13:50:00.481909 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-5f9s4" Mar 20 13:50:01 crc kubenswrapper[4849]: I0320 13:50:01.077024 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-5f9s4"] Mar 20 13:50:01 crc kubenswrapper[4849]: I0320 13:50:01.087761 4849 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:50:01 crc kubenswrapper[4849]: I0320 13:50:01.887511 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-5f9s4" event={"ID":"7dbd39df-02d8-4bc2-8953-9618afa3138d","Type":"ContainerStarted","Data":"a668c7ef49a55d53bfd9a158184d44e12cbbc2d37c803ed88d39b13d0e13c413"} Mar 20 13:50:02 crc kubenswrapper[4849]: I0320 13:50:02.898229 4849 generic.go:334] "Generic (PLEG): container finished" podID="7dbd39df-02d8-4bc2-8953-9618afa3138d" containerID="c6da98ff43e0f1a5e583a16af2d1aec0ab25b16b45b2ad6de2f667d368e756f3" exitCode=0 Mar 20 13:50:02 crc kubenswrapper[4849]: I0320 13:50:02.898284 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-5f9s4" event={"ID":"7dbd39df-02d8-4bc2-8953-9618afa3138d","Type":"ContainerDied","Data":"c6da98ff43e0f1a5e583a16af2d1aec0ab25b16b45b2ad6de2f667d368e756f3"} Mar 20 13:50:02 crc kubenswrapper[4849]: I0320 13:50:02.931533 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4_b3bf2e9e-5405-4c01-977d-0ad6960a13e9/util/0.log" Mar 20 13:50:03 crc kubenswrapper[4849]: I0320 13:50:03.056799 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4_b3bf2e9e-5405-4c01-977d-0ad6960a13e9/pull/0.log" Mar 20 13:50:03 crc kubenswrapper[4849]: I0320 13:50:03.064213 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4_b3bf2e9e-5405-4c01-977d-0ad6960a13e9/util/0.log" Mar 20 13:50:03 crc kubenswrapper[4849]: I0320 13:50:03.160122 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4_b3bf2e9e-5405-4c01-977d-0ad6960a13e9/pull/0.log" Mar 20 13:50:03 crc kubenswrapper[4849]: I0320 13:50:03.371756 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4_b3bf2e9e-5405-4c01-977d-0ad6960a13e9/util/0.log" Mar 20 13:50:03 crc kubenswrapper[4849]: I0320 13:50:03.386065 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4_b3bf2e9e-5405-4c01-977d-0ad6960a13e9/extract/0.log" Mar 20 13:50:03 crc kubenswrapper[4849]: I0320 13:50:03.386429 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c84pmgg4_b3bf2e9e-5405-4c01-977d-0ad6960a13e9/pull/0.log" Mar 20 13:50:03 crc kubenswrapper[4849]: I0320 13:50:03.649983 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-mdm4l_485ab391-8811-4d32-a7ce-de1f2c0cd1e5/manager/0.log" Mar 20 13:50:03 crc 
kubenswrapper[4849]: I0320 13:50:03.794125 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-spl5d_377c37d3-9285-44fb-bcd7-1dba905a3133/manager/0.log" Mar 20 13:50:03 crc kubenswrapper[4849]: I0320 13:50:03.989262 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-gw4br_89f24131-b326-437f-8d55-ccc77b120d8a/manager/0.log" Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.140772 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-bzttm_956cd6f9-4828-4304-9b85-12025b56b9d5/manager/0.log" Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.264092 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-5f9s4" Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.313884 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnjkc\" (UniqueName: \"kubernetes.io/projected/7dbd39df-02d8-4bc2-8953-9618afa3138d-kube-api-access-vnjkc\") pod \"7dbd39df-02d8-4bc2-8953-9618afa3138d\" (UID: \"7dbd39df-02d8-4bc2-8953-9618afa3138d\") " Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.331112 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-jvhjd_bb530eb5-4963-4790-89f6-e21f33d2b254/manager/0.log" Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.356000 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dbd39df-02d8-4bc2-8953-9618afa3138d-kube-api-access-vnjkc" (OuterVolumeSpecName: "kube-api-access-vnjkc") pod "7dbd39df-02d8-4bc2-8953-9618afa3138d" (UID: "7dbd39df-02d8-4bc2-8953-9618afa3138d"). InnerVolumeSpecName "kube-api-access-vnjkc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.421350 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnjkc\" (UniqueName: \"kubernetes.io/projected/7dbd39df-02d8-4bc2-8953-9618afa3138d-kube-api-access-vnjkc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.495148 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-55knt_b186179d-3d3c-4cd1-806b-d7d8682ac88f/manager/0.log" Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.918977 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-5f9s4" event={"ID":"7dbd39df-02d8-4bc2-8953-9618afa3138d","Type":"ContainerDied","Data":"a668c7ef49a55d53bfd9a158184d44e12cbbc2d37c803ed88d39b13d0e13c413"} Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.919357 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a668c7ef49a55d53bfd9a158184d44e12cbbc2d37c803ed88d39b13d0e13c413" Mar 20 13:50:04 crc kubenswrapper[4849]: I0320 13:50:04.919030 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-5f9s4" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.091381 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-8thmf_5345f1a2-c4af-46ca-b53a-acbb0cbcec04/manager/0.log" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.176303 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-669fff9c7c-v8lc5_03c23473-4b4c-4e24-92f6-69363a9cf363/manager/0.log" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.322502 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-vvscl_94697f17-6007-4cd9-9eb6-04832d0e94c6/manager/0.log" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.340969 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-gd8zb"] Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.349434 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-gd8zb"] Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.354654 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-m9dm8_eb969620-248a-4a3d-9377-e61dd62a263a/manager/0.log" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.529171 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-drbhr_3fdffb87-786f-4c4e-88fd-b1dd7bcf728d/manager/0.log" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.624419 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-m6v2j_fe62d445-815f-4606-8cca-aa13f732c509/manager/0.log" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.778000 
4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-z2xd8_f95c179b-0dc5-4cea-98e7-7df754c3c0e2/manager/0.log" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.793884 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-m4gkc_d8b2fadb-ac5a-4883-8f52-059f659844fb/manager/0.log" Mar 20 13:50:05 crc kubenswrapper[4849]: I0320 13:50:05.974286 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f5zg2mh_4996a7a1-3666-436e-b366-7f32c73cee02/manager/0.log" Mar 20 13:50:06 crc kubenswrapper[4849]: I0320 13:50:06.123611 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-59b5998766-k4qxz_3a778a1c-47e6-4f37-8bad-08edb0324503/operator/0.log" Mar 20 13:50:06 crc kubenswrapper[4849]: I0320 13:50:06.411080 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ntf6j_eb632334-854a-446f-8964-4bc5812a7638/registry-server/0.log" Mar 20 13:50:06 crc kubenswrapper[4849]: I0320 13:50:06.558198 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-b78d5_75598a7a-c554-416a-833a-5e2f1a40966e/manager/0.log" Mar 20 13:50:06 crc kubenswrapper[4849]: I0320 13:50:06.653570 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-d8d62_001e060b-cc07-4327-a02f-2a8a9c593aa3/manager/0.log" Mar 20 13:50:06 crc kubenswrapper[4849]: I0320 13:50:06.788324 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-6s2qr_c84791cf-2ae6-4edd-b8b4-449995825ee7/manager/0.log" Mar 20 13:50:06 crc kubenswrapper[4849]: I0320 
13:50:06.977493 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85d5885774-jhdbx_7c9c4158-3ca1-4c9f-8fef-43a35bbff88b/manager/0.log" Mar 20 13:50:07 crc kubenswrapper[4849]: I0320 13:50:07.027080 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-frmv4_35796358-732f-4ec4-88e0-f121b509a14c/manager/0.log" Mar 20 13:50:07 crc kubenswrapper[4849]: I0320 13:50:07.048503 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bec97e6-a0d7-484e-ba32-9469c22ff871" path="/var/lib/kubelet/pods/3bec97e6-a0d7-484e-ba32-9469c22ff871/volumes" Mar 20 13:50:07 crc kubenswrapper[4849]: I0320 13:50:07.055413 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-kfh92_5930666f-c065-44ca-a66c-42d75ef8a0ef/manager/0.log" Mar 20 13:50:07 crc kubenswrapper[4849]: I0320 13:50:07.234524 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-rlm4l_0dfb9834-d621-4ae6-aedf-d9135d4e22cd/manager/0.log" Mar 20 13:50:09 crc kubenswrapper[4849]: I0320 13:50:09.384518 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:50:09 crc kubenswrapper[4849]: I0320 13:50:09.384810 4849 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:50:24 crc kubenswrapper[4849]: 
I0320 13:50:24.796028 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hv7bs_7ec0bd3b-8a73-446d-8fb6-53c537db79f0/control-plane-machine-set-operator/0.log" Mar 20 13:50:25 crc kubenswrapper[4849]: I0320 13:50:25.010967 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tb9sj_db727f58-5ed2-4e4f-88d1-5df962353c84/kube-rbac-proxy/0.log" Mar 20 13:50:25 crc kubenswrapper[4849]: I0320 13:50:25.055579 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tb9sj_db727f58-5ed2-4e4f-88d1-5df962353c84/machine-api-operator/0.log" Mar 20 13:50:36 crc kubenswrapper[4849]: I0320 13:50:36.584130 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-z5lkb_d73cc10b-a789-4f78-8f7e-23e5fef49ae5/cert-manager-controller/0.log" Mar 20 13:50:36 crc kubenswrapper[4849]: I0320 13:50:36.750218 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-x9nbb_602a29bd-b8ec-4538-bdfe-ae7a2bd7149c/cert-manager-cainjector/0.log" Mar 20 13:50:36 crc kubenswrapper[4849]: I0320 13:50:36.812469 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-pgjlm_f65771ac-fff2-4237-b3de-bff20fdda5d1/cert-manager-webhook/0.log" Mar 20 13:50:39 crc kubenswrapper[4849]: I0320 13:50:39.384752 4849 patch_prober.go:28] interesting pod/machine-config-daemon-2pzdl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:50:39 crc kubenswrapper[4849]: I0320 13:50:39.385108 4849 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:50:39 crc kubenswrapper[4849]: I0320 13:50:39.385155 4849 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" Mar 20 13:50:39 crc kubenswrapper[4849]: I0320 13:50:39.385986 4849 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"} pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:50:39 crc kubenswrapper[4849]: I0320 13:50:39.386059 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" containerName="machine-config-daemon" containerID="cri-o://068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" gracePeriod=600 Mar 20 13:50:39 crc kubenswrapper[4849]: E0320 13:50:39.512603 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:50:40 crc kubenswrapper[4849]: I0320 13:50:40.219449 4849 generic.go:334] "Generic (PLEG): container finished" podID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" 
containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" exitCode=0 Mar 20 13:50:40 crc kubenswrapper[4849]: I0320 13:50:40.219810 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerDied","Data":"068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"} Mar 20 13:50:40 crc kubenswrapper[4849]: I0320 13:50:40.219881 4849 scope.go:117] "RemoveContainer" containerID="7a773bf9c49237a354678cd3d44df741a149b88bc3cf8989ae80a8e48fb75b7b" Mar 20 13:50:40 crc kubenswrapper[4849]: I0320 13:50:40.220668 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:50:40 crc kubenswrapper[4849]: E0320 13:50:40.221081 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:50:43 crc kubenswrapper[4849]: I0320 13:50:43.117609 4849 scope.go:117] "RemoveContainer" containerID="0d89e398b4272f12dbec65de3b811005b6c969c4cdcee69ddc8c1ee9f61cccd5" Mar 20 13:50:43 crc kubenswrapper[4849]: I0320 13:50:43.136698 4849 scope.go:117] "RemoveContainer" containerID="a0eb1ae683075f418c02af4c3250e8b93290d3236fe805f6c5e6fdbee5c5a4c9" Mar 20 13:50:43 crc kubenswrapper[4849]: I0320 13:50:43.159369 4849 scope.go:117] "RemoveContainer" containerID="a19c96162d1f1d6aa4932ff050024bf930cd0e515ebb07895568780b8b23af2c" Mar 20 13:50:43 crc kubenswrapper[4849]: I0320 13:50:43.252212 4849 scope.go:117] "RemoveContainer" containerID="881216ee4bf96cb670a317b4944ff7febf607a77c4cbc839160bcb2d730ff126" Mar 20 
13:50:48 crc kubenswrapper[4849]: I0320 13:50:48.018868 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-dlg5s_95c4282d-b3de-46dc-9806-c6cc81e3bde2/nmstate-console-plugin/0.log" Mar 20 13:50:48 crc kubenswrapper[4849]: I0320 13:50:48.191584 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-p5xfw_b059c698-6411-477d-b7de-3da2b096a013/nmstate-handler/0.log" Mar 20 13:50:48 crc kubenswrapper[4849]: I0320 13:50:48.200491 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-h6gpr_2c219def-f597-4e67-87d7-61844f8980e0/kube-rbac-proxy/0.log" Mar 20 13:50:48 crc kubenswrapper[4849]: I0320 13:50:48.249678 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-h6gpr_2c219def-f597-4e67-87d7-61844f8980e0/nmstate-metrics/0.log" Mar 20 13:50:48 crc kubenswrapper[4849]: I0320 13:50:48.431312 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-jnp5v_8dad839f-5780-4ace-a89d-b4e79b51f2be/nmstate-webhook/0.log" Mar 20 13:50:48 crc kubenswrapper[4849]: I0320 13:50:48.937043 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-f45bq_08e1d2e6-28d6-4dcf-bf0f-ea6c92abc7db/nmstate-operator/0.log" Mar 20 13:50:55 crc kubenswrapper[4849]: I0320 13:50:55.035588 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:50:55 crc kubenswrapper[4849]: E0320 13:50:55.036469 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:51:08 crc kubenswrapper[4849]: I0320 13:51:08.036202 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:51:08 crc kubenswrapper[4849]: E0320 13:51:08.037227 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.298295 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-rxmwb_de1d165d-e05b-4ae2-b059-d8635faa0323/kube-rbac-proxy/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.452004 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-rxmwb_de1d165d-e05b-4ae2-b059-d8635faa0323/controller/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.547115 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-frr-files/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.667863 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-reloader/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.720414 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-frr-files/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.726630 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-metrics/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.789830 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-reloader/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.924520 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-frr-files/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.930756 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-reloader/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.951338 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-metrics/0.log" Mar 20 13:51:13 crc kubenswrapper[4849]: I0320 13:51:13.987572 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-metrics/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.148532 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-reloader/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.157443 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-frr-files/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.187536 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/controller/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.199515 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/cp-metrics/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.336430 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/frr-metrics/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.352476 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/kube-rbac-proxy/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.413730 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/kube-rbac-proxy-frr/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.531832 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/reloader/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.586264 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-kbdhz_61f44068-2b56-4919-83ac-1e6c43aaf840/frr-k8s-webhook-server/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.790631 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7fcdcd599c-wtkkp_ba371ab1-5402-4921-b341-f54033be1fca/manager/0.log" Mar 20 13:51:14 crc kubenswrapper[4849]: I0320 13:51:14.963346 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74dc9df6c8-4p87h_5a336306-f1db-4c3d-ab2e-30c344b03195/webhook-server/0.log" Mar 20 13:51:15 crc kubenswrapper[4849]: I0320 13:51:15.030219 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zmzbp_9861c394-1567-4eeb-b487-979a4b725630/kube-rbac-proxy/0.log" Mar 20 13:51:15 crc kubenswrapper[4849]: I0320 13:51:15.198166 4849 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9k4zx_82866517-1e14-49c5-81be-88da5e861369/frr/0.log" Mar 20 13:51:15 crc kubenswrapper[4849]: I0320 13:51:15.428676 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zmzbp_9861c394-1567-4eeb-b487-979a4b725630/speaker/0.log" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.767491 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c4c7n"] Mar 20 13:51:18 crc kubenswrapper[4849]: E0320 13:51:18.769545 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dbd39df-02d8-4bc2-8953-9618afa3138d" containerName="oc" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.769653 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dbd39df-02d8-4bc2-8953-9618afa3138d" containerName="oc" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.769989 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dbd39df-02d8-4bc2-8953-9618afa3138d" containerName="oc" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.771701 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.785900 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c4c7n"] Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.851207 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-catalog-content\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.851262 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-utilities\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.851619 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqxtg\" (UniqueName: \"kubernetes.io/projected/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-kube-api-access-vqxtg\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.953719 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqxtg\" (UniqueName: \"kubernetes.io/projected/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-kube-api-access-vqxtg\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.954110 4849 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-catalog-content\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.954233 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-utilities\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.954644 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-catalog-content\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.954659 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-utilities\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:18 crc kubenswrapper[4849]: I0320 13:51:18.974247 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqxtg\" (UniqueName: \"kubernetes.io/projected/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-kube-api-access-vqxtg\") pod \"community-operators-c4c7n\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:19 crc kubenswrapper[4849]: I0320 13:51:19.103466 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:19 crc kubenswrapper[4849]: I0320 13:51:19.634750 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c4c7n"] Mar 20 13:51:20 crc kubenswrapper[4849]: I0320 13:51:20.552801 4849 generic.go:334] "Generic (PLEG): container finished" podID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerID="9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea" exitCode=0 Mar 20 13:51:20 crc kubenswrapper[4849]: I0320 13:51:20.552895 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4c7n" event={"ID":"27c00e76-1e86-47a8-b15c-6ec48e7b91f4","Type":"ContainerDied","Data":"9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea"} Mar 20 13:51:20 crc kubenswrapper[4849]: I0320 13:51:20.553122 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4c7n" event={"ID":"27c00e76-1e86-47a8-b15c-6ec48e7b91f4","Type":"ContainerStarted","Data":"5ae459456266703affcd616f23da5ba423fa9ddf07964a6bb1939f56dfaeb941"} Mar 20 13:51:22 crc kubenswrapper[4849]: I0320 13:51:22.036374 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:51:22 crc kubenswrapper[4849]: E0320 13:51:22.037208 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:51:22 crc kubenswrapper[4849]: I0320 13:51:22.574880 4849 generic.go:334] "Generic (PLEG): container finished" podID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" 
containerID="7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb" exitCode=0 Mar 20 13:51:22 crc kubenswrapper[4849]: I0320 13:51:22.574918 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4c7n" event={"ID":"27c00e76-1e86-47a8-b15c-6ec48e7b91f4","Type":"ContainerDied","Data":"7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb"} Mar 20 13:51:23 crc kubenswrapper[4849]: I0320 13:51:23.584883 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4c7n" event={"ID":"27c00e76-1e86-47a8-b15c-6ec48e7b91f4","Type":"ContainerStarted","Data":"8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f"} Mar 20 13:51:27 crc kubenswrapper[4849]: I0320 13:51:27.777880 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg_b39cc5fb-85b5-407a-b4ca-7b674ae7039d/util/0.log" Mar 20 13:51:27 crc kubenswrapper[4849]: I0320 13:51:27.995469 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg_b39cc5fb-85b5-407a-b4ca-7b674ae7039d/pull/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.022622 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg_b39cc5fb-85b5-407a-b4ca-7b674ae7039d/util/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.063685 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg_b39cc5fb-85b5-407a-b4ca-7b674ae7039d/pull/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.243300 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg_b39cc5fb-85b5-407a-b4ca-7b674ae7039d/util/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.245093 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg_b39cc5fb-85b5-407a-b4ca-7b674ae7039d/pull/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.261234 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874m8klg_b39cc5fb-85b5-407a-b4ca-7b674ae7039d/extract/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.401620 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2_e50fa65a-4209-4a3e-8626-278c3920e206/util/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.648265 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2_e50fa65a-4209-4a3e-8626-278c3920e206/util/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.651381 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2_e50fa65a-4209-4a3e-8626-278c3920e206/pull/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.680833 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2_e50fa65a-4209-4a3e-8626-278c3920e206/pull/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.830127 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2_e50fa65a-4209-4a3e-8626-278c3920e206/util/0.log" Mar 20 
13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.850104 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2_e50fa65a-4209-4a3e-8626-278c3920e206/pull/0.log" Mar 20 13:51:28 crc kubenswrapper[4849]: I0320 13:51:28.886433 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1vq9k2_e50fa65a-4209-4a3e-8626-278c3920e206/extract/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.023231 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvvvr_036f1af8-83ae-4a96-b192-8349a3f78e16/extract-utilities/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.104538 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.104586 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.164420 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.178412 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvvvr_036f1af8-83ae-4a96-b192-8349a3f78e16/extract-content/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.180697 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c4c7n" podStartSLOduration=8.733241351 podStartE2EDuration="11.180682071s" podCreationTimestamp="2026-03-20 13:51:18 +0000 UTC" firstStartedPulling="2026-03-20 13:51:20.555587664 +0000 UTC m=+1630.233311059" lastFinishedPulling="2026-03-20 
13:51:23.003028384 +0000 UTC m=+1632.680751779" observedRunningTime="2026-03-20 13:51:23.605474187 +0000 UTC m=+1633.283197582" watchObservedRunningTime="2026-03-20 13:51:29.180682071 +0000 UTC m=+1638.858405466" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.209034 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvvvr_036f1af8-83ae-4a96-b192-8349a3f78e16/extract-content/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.216902 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvvvr_036f1af8-83ae-4a96-b192-8349a3f78e16/extract-utilities/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.401600 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvvvr_036f1af8-83ae-4a96-b192-8349a3f78e16/extract-content/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.437632 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvvvr_036f1af8-83ae-4a96-b192-8349a3f78e16/extract-utilities/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.632044 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zvvvr_036f1af8-83ae-4a96-b192-8349a3f78e16/registry-server/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.652210 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-c4c7n_27c00e76-1e86-47a8-b15c-6ec48e7b91f4/extract-utilities/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.704617 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.755205 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c4c7n"] Mar 20 
13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.805519 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-c4c7n_27c00e76-1e86-47a8-b15c-6ec48e7b91f4/extract-content/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.812297 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-c4c7n_27c00e76-1e86-47a8-b15c-6ec48e7b91f4/extract-content/0.log" Mar 20 13:51:29 crc kubenswrapper[4849]: I0320 13:51:29.812344 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-c4c7n_27c00e76-1e86-47a8-b15c-6ec48e7b91f4/extract-utilities/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.010263 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-c4c7n_27c00e76-1e86-47a8-b15c-6ec48e7b91f4/extract-utilities/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.020620 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-c4c7n_27c00e76-1e86-47a8-b15c-6ec48e7b91f4/extract-content/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.025062 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-c4c7n_27c00e76-1e86-47a8-b15c-6ec48e7b91f4/registry-server/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.168032 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hkn9l_dfbb658a-6c29-4a47-be7c-37aaade8f494/extract-utilities/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.336002 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hkn9l_dfbb658a-6c29-4a47-be7c-37aaade8f494/extract-content/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.336423 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-hkn9l_dfbb658a-6c29-4a47-be7c-37aaade8f494/extract-content/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.348446 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hkn9l_dfbb658a-6c29-4a47-be7c-37aaade8f494/extract-utilities/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.541272 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hkn9l_dfbb658a-6c29-4a47-be7c-37aaade8f494/extract-content/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.554045 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hkn9l_dfbb658a-6c29-4a47-be7c-37aaade8f494/extract-utilities/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.798694 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-fhpw4_ead5a591-d201-4f88-8357-d2c8d3ceb93e/marketplace-operator/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.843985 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hkn9l_dfbb658a-6c29-4a47-be7c-37aaade8f494/registry-server/0.log" Mar 20 13:51:30 crc kubenswrapper[4849]: I0320 13:51:30.878193 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h596s_00f0000c-6337-492e-928d-047fcbf4dfc5/extract-utilities/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.011891 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h596s_00f0000c-6337-492e-928d-047fcbf4dfc5/extract-content/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.050301 4849 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-h596s_00f0000c-6337-492e-928d-047fcbf4dfc5/extract-utilities/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.070805 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h596s_00f0000c-6337-492e-928d-047fcbf4dfc5/extract-content/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.184431 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h596s_00f0000c-6337-492e-928d-047fcbf4dfc5/extract-utilities/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.215918 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h596s_00f0000c-6337-492e-928d-047fcbf4dfc5/extract-content/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.313907 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h596s_00f0000c-6337-492e-928d-047fcbf4dfc5/registry-server/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.375708 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865zh_95576495-a434-415a-98c3-714268b1d0c1/extract-utilities/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.552234 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865zh_95576495-a434-415a-98c3-714268b1d0c1/extract-content/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.564673 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865zh_95576495-a434-415a-98c3-714268b1d0c1/extract-utilities/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.573459 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865zh_95576495-a434-415a-98c3-714268b1d0c1/extract-content/0.log" 
Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.666559 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c4c7n" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerName="registry-server" containerID="cri-o://8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f" gracePeriod=2 Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.744776 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865zh_95576495-a434-415a-98c3-714268b1d0c1/extract-utilities/0.log" Mar 20 13:51:31 crc kubenswrapper[4849]: I0320 13:51:31.831415 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865zh_95576495-a434-415a-98c3-714268b1d0c1/extract-content/0.log" Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.019059 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865zh_95576495-a434-415a-98c3-714268b1d0c1/registry-server/0.log" Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.313167 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c4c7n" Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.409888 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-catalog-content\") pod \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.410078 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-utilities\") pod \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.410150 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqxtg\" (UniqueName: \"kubernetes.io/projected/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-kube-api-access-vqxtg\") pod \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\" (UID: \"27c00e76-1e86-47a8-b15c-6ec48e7b91f4\") " Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.411776 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-utilities" (OuterVolumeSpecName: "utilities") pod "27c00e76-1e86-47a8-b15c-6ec48e7b91f4" (UID: "27c00e76-1e86-47a8-b15c-6ec48e7b91f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.415740 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-kube-api-access-vqxtg" (OuterVolumeSpecName: "kube-api-access-vqxtg") pod "27c00e76-1e86-47a8-b15c-6ec48e7b91f4" (UID: "27c00e76-1e86-47a8-b15c-6ec48e7b91f4"). InnerVolumeSpecName "kube-api-access-vqxtg". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.463712 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27c00e76-1e86-47a8-b15c-6ec48e7b91f4" (UID: "27c00e76-1e86-47a8-b15c-6ec48e7b91f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.512150 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.512395 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.512475 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqxtg\" (UniqueName: \"kubernetes.io/projected/27c00e76-1e86-47a8-b15c-6ec48e7b91f4-kube-api-access-vqxtg\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.680202 4849 generic.go:334] "Generic (PLEG): container finished" podID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerID="8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f" exitCode=0
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.680302 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4c7n" event={"ID":"27c00e76-1e86-47a8-b15c-6ec48e7b91f4","Type":"ContainerDied","Data":"8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f"}
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.680335 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4c7n" event={"ID":"27c00e76-1e86-47a8-b15c-6ec48e7b91f4","Type":"ContainerDied","Data":"5ae459456266703affcd616f23da5ba423fa9ddf07964a6bb1939f56dfaeb941"}
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.680356 4849 scope.go:117] "RemoveContainer" containerID="8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.680514 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c4c7n"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.708660 4849 scope.go:117] "RemoveContainer" containerID="7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.735355 4849 scope.go:117] "RemoveContainer" containerID="9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.746617 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c4c7n"]
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.754521 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c4c7n"]
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.773486 4849 scope.go:117] "RemoveContainer" containerID="8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f"
Mar 20 13:51:32 crc kubenswrapper[4849]: E0320 13:51:32.774015 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f\": container with ID starting with 8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f not found: ID does not exist" containerID="8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.774055 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f"} err="failed to get container status \"8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f\": rpc error: code = NotFound desc = could not find container \"8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f\": container with ID starting with 8208a9cce94686f9abb886fcc7d81a7f9fbbf49ae708657c00d4fa1da64b152f not found: ID does not exist"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.774083 4849 scope.go:117] "RemoveContainer" containerID="7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb"
Mar 20 13:51:32 crc kubenswrapper[4849]: E0320 13:51:32.774505 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb\": container with ID starting with 7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb not found: ID does not exist" containerID="7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.774533 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb"} err="failed to get container status \"7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb\": rpc error: code = NotFound desc = could not find container \"7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb\": container with ID starting with 7cb72381631e9ff122252a64567be9e921e1ea92dc85c9be5eb72d2c27ae8afb not found: ID does not exist"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.774567 4849 scope.go:117] "RemoveContainer" containerID="9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea"
Mar 20 13:51:32 crc kubenswrapper[4849]: E0320 13:51:32.774935 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea\": container with ID starting with 9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea not found: ID does not exist" containerID="9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea"
Mar 20 13:51:32 crc kubenswrapper[4849]: I0320 13:51:32.775035 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea"} err="failed to get container status \"9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea\": rpc error: code = NotFound desc = could not find container \"9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea\": container with ID starting with 9d61fabe4288cefc9b3409ba8623e61e54e2c80723c2ffd0f0f94545fcc8c3ea not found: ID does not exist"
Mar 20 13:51:33 crc kubenswrapper[4849]: I0320 13:51:33.053749 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" path="/var/lib/kubelet/pods/27c00e76-1e86-47a8-b15c-6ec48e7b91f4/volumes"
Mar 20 13:51:35 crc kubenswrapper[4849]: I0320 13:51:35.035991 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"
Mar 20 13:51:35 crc kubenswrapper[4849]: E0320 13:51:35.036704 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a"
Mar 20 13:51:43 crc kubenswrapper[4849]: I0320 13:51:43.346017 4849 scope.go:117] "RemoveContainer" containerID="47ad10e192eab044a506b16f61b3d058fe52c5d53ead1431411be3904fe2b71f"
Mar 20 13:51:43 crc kubenswrapper[4849]: I0320 13:51:43.380438 4849 scope.go:117] "RemoveContainer" containerID="76caa296c758a2d6f13892e5c663cd03a8abba4518249fa55ad4c3ca8bf083a5"
Mar 20 13:51:50 crc kubenswrapper[4849]: I0320 13:51:50.036414 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"
Mar 20 13:51:50 crc kubenswrapper[4849]: E0320 13:51:50.037172 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.147399 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566912-t8hbm"]
Mar 20 13:52:00 crc kubenswrapper[4849]: E0320 13:52:00.148209 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerName="extract-content"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.148223 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerName="extract-content"
Mar 20 13:52:00 crc kubenswrapper[4849]: E0320 13:52:00.148236 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerName="extract-utilities"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.148242 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerName="extract-utilities"
Mar 20 13:52:00 crc kubenswrapper[4849]: E0320 13:52:00.148269 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerName="registry-server"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.148275 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerName="registry-server"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.148435 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="27c00e76-1e86-47a8-b15c-6ec48e7b91f4" containerName="registry-server"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.149032 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-t8hbm"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.152297 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.153012 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.153246 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.159441 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-t8hbm"]
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.313925 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlpv\" (UniqueName: \"kubernetes.io/projected/f93f39d2-bac3-40a4-ab15-5b45b19965ac-kube-api-access-jjlpv\") pod \"auto-csr-approver-29566912-t8hbm\" (UID: \"f93f39d2-bac3-40a4-ab15-5b45b19965ac\") " pod="openshift-infra/auto-csr-approver-29566912-t8hbm"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.416002 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlpv\" (UniqueName: \"kubernetes.io/projected/f93f39d2-bac3-40a4-ab15-5b45b19965ac-kube-api-access-jjlpv\") pod \"auto-csr-approver-29566912-t8hbm\" (UID: \"f93f39d2-bac3-40a4-ab15-5b45b19965ac\") " pod="openshift-infra/auto-csr-approver-29566912-t8hbm"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.435711 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjlpv\" (UniqueName: \"kubernetes.io/projected/f93f39d2-bac3-40a4-ab15-5b45b19965ac-kube-api-access-jjlpv\") pod \"auto-csr-approver-29566912-t8hbm\" (UID: \"f93f39d2-bac3-40a4-ab15-5b45b19965ac\") " pod="openshift-infra/auto-csr-approver-29566912-t8hbm"
Mar 20 13:52:00 crc kubenswrapper[4849]: I0320 13:52:00.472301 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-t8hbm"
Mar 20 13:52:01 crc kubenswrapper[4849]: I0320 13:52:01.031562 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-t8hbm"]
Mar 20 13:52:01 crc kubenswrapper[4849]: I0320 13:52:01.910340 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-t8hbm" event={"ID":"f93f39d2-bac3-40a4-ab15-5b45b19965ac","Type":"ContainerStarted","Data":"8819e5cc27e5b83f737e4c10d5fed208803736c9de1b6298fb3b5043044d440b"}
Mar 20 13:52:03 crc kubenswrapper[4849]: I0320 13:52:03.926121 4849 generic.go:334] "Generic (PLEG): container finished" podID="f93f39d2-bac3-40a4-ab15-5b45b19965ac" containerID="6aab40d6e9adc4706803181119150adc91d73cb47310482e084179280c46bc91" exitCode=0
Mar 20 13:52:03 crc kubenswrapper[4849]: I0320 13:52:03.926201 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-t8hbm" event={"ID":"f93f39d2-bac3-40a4-ab15-5b45b19965ac","Type":"ContainerDied","Data":"6aab40d6e9adc4706803181119150adc91d73cb47310482e084179280c46bc91"}
Mar 20 13:52:04 crc kubenswrapper[4849]: E0320 13:52:04.505639 4849 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.194:53086->38.102.83.194:37345: write tcp 38.102.83.194:53086->38.102.83.194:37345: write: broken pipe
Mar 20 13:52:05 crc kubenswrapper[4849]: I0320 13:52:05.037005 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"
Mar 20 13:52:05 crc kubenswrapper[4849]: E0320 13:52:05.038625 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a"
Mar 20 13:52:05 crc kubenswrapper[4849]: I0320 13:52:05.278978 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-t8hbm"
Mar 20 13:52:05 crc kubenswrapper[4849]: I0320 13:52:05.403260 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjlpv\" (UniqueName: \"kubernetes.io/projected/f93f39d2-bac3-40a4-ab15-5b45b19965ac-kube-api-access-jjlpv\") pod \"f93f39d2-bac3-40a4-ab15-5b45b19965ac\" (UID: \"f93f39d2-bac3-40a4-ab15-5b45b19965ac\") "
Mar 20 13:52:05 crc kubenswrapper[4849]: I0320 13:52:05.410380 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93f39d2-bac3-40a4-ab15-5b45b19965ac-kube-api-access-jjlpv" (OuterVolumeSpecName: "kube-api-access-jjlpv") pod "f93f39d2-bac3-40a4-ab15-5b45b19965ac" (UID: "f93f39d2-bac3-40a4-ab15-5b45b19965ac"). InnerVolumeSpecName "kube-api-access-jjlpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:05 crc kubenswrapper[4849]: I0320 13:52:05.505694 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjlpv\" (UniqueName: \"kubernetes.io/projected/f93f39d2-bac3-40a4-ab15-5b45b19965ac-kube-api-access-jjlpv\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:05 crc kubenswrapper[4849]: I0320 13:52:05.955889 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-t8hbm" event={"ID":"f93f39d2-bac3-40a4-ab15-5b45b19965ac","Type":"ContainerDied","Data":"8819e5cc27e5b83f737e4c10d5fed208803736c9de1b6298fb3b5043044d440b"}
Mar 20 13:52:05 crc kubenswrapper[4849]: I0320 13:52:05.956625 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8819e5cc27e5b83f737e4c10d5fed208803736c9de1b6298fb3b5043044d440b"
Mar 20 13:52:05 crc kubenswrapper[4849]: I0320 13:52:05.956159 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-t8hbm"
Mar 20 13:52:06 crc kubenswrapper[4849]: I0320 13:52:06.359339 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-4pc5f"]
Mar 20 13:52:06 crc kubenswrapper[4849]: I0320 13:52:06.366729 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-4pc5f"]
Mar 20 13:52:07 crc kubenswrapper[4849]: I0320 13:52:07.046979 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91dea7c8-c5a7-4c12-8e7b-8477fededeb5" path="/var/lib/kubelet/pods/91dea7c8-c5a7-4c12-8e7b-8477fededeb5/volumes"
Mar 20 13:52:16 crc kubenswrapper[4849]: I0320 13:52:16.036017 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"
Mar 20 13:52:16 crc kubenswrapper[4849]: E0320 13:52:16.036626 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.036453 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"
Mar 20 13:52:30 crc kubenswrapper[4849]: E0320 13:52:30.037279 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.046187 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5mbs6"]
Mar 20 13:52:30 crc kubenswrapper[4849]: E0320 13:52:30.046573 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93f39d2-bac3-40a4-ab15-5b45b19965ac" containerName="oc"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.046583 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93f39d2-bac3-40a4-ab15-5b45b19965ac" containerName="oc"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.046804 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93f39d2-bac3-40a4-ab15-5b45b19965ac" containerName="oc"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.048049 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.069543 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mbs6"]
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.096938 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-utilities\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.097323 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-catalog-content\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.097378 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlmk\" (UniqueName: \"kubernetes.io/projected/94695cd3-0a50-4dd3-b1c0-6c829cd10790-kube-api-access-krlmk\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.199913 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-utilities\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.200045 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-catalog-content\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.200071 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krlmk\" (UniqueName: \"kubernetes.io/projected/94695cd3-0a50-4dd3-b1c0-6c829cd10790-kube-api-access-krlmk\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.200470 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-catalog-content\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.200692 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-utilities\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.224586 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krlmk\" (UniqueName: \"kubernetes.io/projected/94695cd3-0a50-4dd3-b1c0-6c829cd10790-kube-api-access-krlmk\") pod \"redhat-marketplace-5mbs6\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") " pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.368410 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:30 crc kubenswrapper[4849]: I0320 13:52:30.828944 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mbs6"]
Mar 20 13:52:31 crc kubenswrapper[4849]: I0320 13:52:31.205954 4849 generic.go:334] "Generic (PLEG): container finished" podID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerID="ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b" exitCode=0
Mar 20 13:52:31 crc kubenswrapper[4849]: I0320 13:52:31.206023 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mbs6" event={"ID":"94695cd3-0a50-4dd3-b1c0-6c829cd10790","Type":"ContainerDied","Data":"ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b"}
Mar 20 13:52:31 crc kubenswrapper[4849]: I0320 13:52:31.206063 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mbs6" event={"ID":"94695cd3-0a50-4dd3-b1c0-6c829cd10790","Type":"ContainerStarted","Data":"a2672f3fa7449e9da62d99f0794c654886621de86fdf06ae04f8e4af3dac7d19"}
Mar 20 13:52:32 crc kubenswrapper[4849]: I0320 13:52:32.215509 4849 generic.go:334] "Generic (PLEG): container finished" podID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerID="63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd" exitCode=0
Mar 20 13:52:32 crc kubenswrapper[4849]: I0320 13:52:32.215597 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mbs6" event={"ID":"94695cd3-0a50-4dd3-b1c0-6c829cd10790","Type":"ContainerDied","Data":"63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd"}
Mar 20 13:52:37 crc kubenswrapper[4849]: I0320 13:52:37.262047 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mbs6" event={"ID":"94695cd3-0a50-4dd3-b1c0-6c829cd10790","Type":"ContainerStarted","Data":"7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7"}
Mar 20 13:52:37 crc kubenswrapper[4849]: I0320 13:52:37.287621 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5mbs6" podStartSLOduration=1.830981502 podStartE2EDuration="7.287603025s" podCreationTimestamp="2026-03-20 13:52:30 +0000 UTC" firstStartedPulling="2026-03-20 13:52:31.207848751 +0000 UTC m=+1700.885572146" lastFinishedPulling="2026-03-20 13:52:36.664470274 +0000 UTC m=+1706.342193669" observedRunningTime="2026-03-20 13:52:37.28279174 +0000 UTC m=+1706.960515145" watchObservedRunningTime="2026-03-20 13:52:37.287603025 +0000 UTC m=+1706.965326420"
Mar 20 13:52:40 crc kubenswrapper[4849]: I0320 13:52:40.368507 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:40 crc kubenswrapper[4849]: I0320 13:52:40.368874 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:40 crc kubenswrapper[4849]: I0320 13:52:40.413499 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:41 crc kubenswrapper[4849]: I0320 13:52:41.353034 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:41 crc kubenswrapper[4849]: I0320 13:52:41.418274 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mbs6"]
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.035593 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"
Mar 20 13:52:43 crc kubenswrapper[4849]: E0320 13:52:43.036173 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a"
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.307875 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5mbs6" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerName="registry-server" containerID="cri-o://7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7" gracePeriod=2
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.447486 4849 scope.go:117] "RemoveContainer" containerID="2049386afc5409d1f4d192768a19d21afc8294cd5dcc96d60464a0b509e0003a"
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.777447 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.978642 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-utilities\") pod \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") "
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.978714 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krlmk\" (UniqueName: \"kubernetes.io/projected/94695cd3-0a50-4dd3-b1c0-6c829cd10790-kube-api-access-krlmk\") pod \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") "
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.978795 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-catalog-content\") pod \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\" (UID: \"94695cd3-0a50-4dd3-b1c0-6c829cd10790\") "
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.979632 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-utilities" (OuterVolumeSpecName: "utilities") pod "94695cd3-0a50-4dd3-b1c0-6c829cd10790" (UID: "94695cd3-0a50-4dd3-b1c0-6c829cd10790"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:52:43 crc kubenswrapper[4849]: I0320 13:52:43.984579 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94695cd3-0a50-4dd3-b1c0-6c829cd10790-kube-api-access-krlmk" (OuterVolumeSpecName: "kube-api-access-krlmk") pod "94695cd3-0a50-4dd3-b1c0-6c829cd10790" (UID: "94695cd3-0a50-4dd3-b1c0-6c829cd10790"). InnerVolumeSpecName "kube-api-access-krlmk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.004618 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94695cd3-0a50-4dd3-b1c0-6c829cd10790" (UID: "94695cd3-0a50-4dd3-b1c0-6c829cd10790"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.081567 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.081598 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krlmk\" (UniqueName: \"kubernetes.io/projected/94695cd3-0a50-4dd3-b1c0-6c829cd10790-kube-api-access-krlmk\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.081609 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94695cd3-0a50-4dd3-b1c0-6c829cd10790-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.321092 4849 generic.go:334] "Generic (PLEG): container finished" podID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerID="7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7" exitCode=0
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.321141 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mbs6" event={"ID":"94695cd3-0a50-4dd3-b1c0-6c829cd10790","Type":"ContainerDied","Data":"7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7"}
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.321169 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mbs6" event={"ID":"94695cd3-0a50-4dd3-b1c0-6c829cd10790","Type":"ContainerDied","Data":"a2672f3fa7449e9da62d99f0794c654886621de86fdf06ae04f8e4af3dac7d19"}
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.321171 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mbs6"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.321189 4849 scope.go:117] "RemoveContainer" containerID="7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.337771 4849 scope.go:117] "RemoveContainer" containerID="63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.371561 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mbs6"]
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.383215 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mbs6"]
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.394091 4849 scope.go:117] "RemoveContainer" containerID="ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.422042 4849 scope.go:117] "RemoveContainer" containerID="7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7"
Mar 20 13:52:44 crc kubenswrapper[4849]: E0320 13:52:44.422646 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7\": container with ID starting with 7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7 not found: ID does not exist" containerID="7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.422711 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7"} err="failed to get container status \"7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7\": rpc error: code = NotFound desc = could not find container \"7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7\": container with ID starting with 7d86bd811c8ad31212ec746503de035135befa20b8507e18891f91b36ab80ce7 not found: ID does not exist"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.422789 4849 scope.go:117] "RemoveContainer" containerID="63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd"
Mar 20 13:52:44 crc kubenswrapper[4849]: E0320 13:52:44.423718 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd\": container with ID starting with 63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd not found: ID does not exist" containerID="63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.423762 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd"} err="failed to get container status \"63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd\": rpc error: code = NotFound desc = could not find container \"63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd\": container with ID starting with 63da59d4169c6906125d753a5ec721265ec47c286da8ee6feee85f13dcfe5ecd not found: ID does not exist"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.423795 4849 scope.go:117] "RemoveContainer" containerID="ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b"
Mar 20 13:52:44 crc kubenswrapper[4849]: E0320 13:52:44.424116 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b\": container with ID starting with ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b not found: ID does not exist" containerID="ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b"
Mar 20 13:52:44 crc kubenswrapper[4849]: I0320 13:52:44.424153 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b"} err="failed to get container status \"ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b\": rpc error: code = NotFound desc = could not find container \"ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b\": container with ID starting with ed2a8ab3224682dfce6b4d0383179189713b62a2b2f1b205fe64858e0fd7969b not found: ID does not exist"
Mar 20 13:52:45 crc kubenswrapper[4849]: I0320 13:52:45.047602 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" path="/var/lib/kubelet/pods/94695cd3-0a50-4dd3-b1c0-6c829cd10790/volumes"
Mar 20 13:52:54 crc kubenswrapper[4849]: I0320 13:52:54.035856 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"
Mar 20 13:52:54 crc kubenswrapper[4849]: E0320 13:52:54.036444 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a"
Mar 20 13:53:06 crc kubenswrapper[4849]: I0320 13:53:06.037105 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0"
Mar 20 13:53:06 crc kubenswrapper[4849]: E0320 13:53:06.038371 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a"
Mar 20 13:53:08 crc kubenswrapper[4849]: I0320 13:53:08.049117 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-pjvhk"]
Mar 20 13:53:08 crc kubenswrapper[4849]: I0320 13:53:08.058530 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-pjvhk"]
Mar 20 13:53:09 crc kubenswrapper[4849]: I0320 13:53:09.050312 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81b9c6f1-f1c5-4310-9c95-649b730470a5" path="/var/lib/kubelet/pods/81b9c6f1-f1c5-4310-9c95-649b730470a5/volumes"
Mar 20 13:53:09 crc kubenswrapper[4849]: I0320 13:53:09.051378 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qft7p"]
Mar 20 13:53:09 crc kubenswrapper[4849]: I0320 13:53:09.051404 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0a99-account-create-update-8k85v"]
Mar 20 13:53:09 crc kubenswrapper[4849]: I0320 13:53:09.053970 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0a99-account-create-update-8k85v"]
Mar 20 13:53:09 crc kubenswrapper[4849]: I0320 13:53:09.063589 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qft7p"]
Mar 20 13:53:09 crc kubenswrapper[4849]: I0320 13:53:09.071557 4849 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openstack/placement-3d75-account-create-update-6bd9j"] Mar 20 13:53:09 crc kubenswrapper[4849]: I0320 13:53:09.078407 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3d75-account-create-update-6bd9j"] Mar 20 13:53:11 crc kubenswrapper[4849]: I0320 13:53:11.029463 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9d498"] Mar 20 13:53:11 crc kubenswrapper[4849]: I0320 13:53:11.052524 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b28b324-7675-41b1-b1af-e37801c55af0" path="/var/lib/kubelet/pods/0b28b324-7675-41b1-b1af-e37801c55af0/volumes" Mar 20 13:53:11 crc kubenswrapper[4849]: I0320 13:53:11.053458 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481f5fb9-0040-4372-96b7-15e549dab23a" path="/var/lib/kubelet/pods/481f5fb9-0040-4372-96b7-15e549dab23a/volumes" Mar 20 13:53:11 crc kubenswrapper[4849]: I0320 13:53:11.054155 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0fd8a46-86a6-403d-b740-ddd048bdc4b0" path="/var/lib/kubelet/pods/c0fd8a46-86a6-403d-b740-ddd048bdc4b0/volumes" Mar 20 13:53:11 crc kubenswrapper[4849]: I0320 13:53:11.054790 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-fdbf-account-create-update-2xkp5"] Mar 20 13:53:11 crc kubenswrapper[4849]: I0320 13:53:11.054845 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9d498"] Mar 20 13:53:11 crc kubenswrapper[4849]: I0320 13:53:11.058169 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-fdbf-account-create-update-2xkp5"] Mar 20 13:53:13 crc kubenswrapper[4849]: I0320 13:53:13.047074 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69beefe1-45de-469f-a3af-e42a88b38309" path="/var/lib/kubelet/pods/69beefe1-45de-469f-a3af-e42a88b38309/volumes" Mar 20 13:53:13 crc kubenswrapper[4849]: I0320 13:53:13.048125 4849 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="b0a03006-5384-4542-8b30-dc8bea37c96a" path="/var/lib/kubelet/pods/b0a03006-5384-4542-8b30-dc8bea37c96a/volumes" Mar 20 13:53:14 crc kubenswrapper[4849]: I0320 13:53:14.600490 4849 generic.go:334] "Generic (PLEG): container finished" podID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerID="9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f" exitCode=0 Mar 20 13:53:14 crc kubenswrapper[4849]: I0320 13:53:14.600596 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cj48j/must-gather-tdtwd" event={"ID":"d7e6493a-981c-4e06-ade3-77a34e3da785","Type":"ContainerDied","Data":"9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f"} Mar 20 13:53:14 crc kubenswrapper[4849]: I0320 13:53:14.601986 4849 scope.go:117] "RemoveContainer" containerID="9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f" Mar 20 13:53:15 crc kubenswrapper[4849]: I0320 13:53:15.137782 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cj48j_must-gather-tdtwd_d7e6493a-981c-4e06-ade3-77a34e3da785/gather/0.log" Mar 20 13:53:19 crc kubenswrapper[4849]: I0320 13:53:19.036527 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:53:19 crc kubenswrapper[4849]: E0320 13:53:19.037406 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:53:22 crc kubenswrapper[4849]: I0320 13:53:22.596956 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cj48j/must-gather-tdtwd"] Mar 20 13:53:22 crc 
kubenswrapper[4849]: I0320 13:53:22.597987 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-cj48j/must-gather-tdtwd" podUID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerName="copy" containerID="cri-o://b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150" gracePeriod=2 Mar 20 13:53:22 crc kubenswrapper[4849]: I0320 13:53:22.614586 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cj48j/must-gather-tdtwd"] Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.018946 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cj48j_must-gather-tdtwd_d7e6493a-981c-4e06-ade3-77a34e3da785/copy/0.log" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.019803 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.143854 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e6493a-981c-4e06-ade3-77a34e3da785-must-gather-output\") pod \"d7e6493a-981c-4e06-ade3-77a34e3da785\" (UID: \"d7e6493a-981c-4e06-ade3-77a34e3da785\") " Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.143918 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n65z\" (UniqueName: \"kubernetes.io/projected/d7e6493a-981c-4e06-ade3-77a34e3da785-kube-api-access-8n65z\") pod \"d7e6493a-981c-4e06-ade3-77a34e3da785\" (UID: \"d7e6493a-981c-4e06-ade3-77a34e3da785\") " Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.161846 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e6493a-981c-4e06-ade3-77a34e3da785-kube-api-access-8n65z" (OuterVolumeSpecName: "kube-api-access-8n65z") pod "d7e6493a-981c-4e06-ade3-77a34e3da785" (UID: 
"d7e6493a-981c-4e06-ade3-77a34e3da785"). InnerVolumeSpecName "kube-api-access-8n65z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.247477 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n65z\" (UniqueName: \"kubernetes.io/projected/d7e6493a-981c-4e06-ade3-77a34e3da785-kube-api-access-8n65z\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.273275 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e6493a-981c-4e06-ade3-77a34e3da785-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d7e6493a-981c-4e06-ade3-77a34e3da785" (UID: "d7e6493a-981c-4e06-ade3-77a34e3da785"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.349026 4849 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e6493a-981c-4e06-ade3-77a34e3da785-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.705600 4849 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cj48j_must-gather-tdtwd_d7e6493a-981c-4e06-ade3-77a34e3da785/copy/0.log" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.706674 4849 generic.go:334] "Generic (PLEG): container finished" podID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerID="b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150" exitCode=143 Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.706739 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cj48j/must-gather-tdtwd" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.706788 4849 scope.go:117] "RemoveContainer" containerID="b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.732630 4849 scope.go:117] "RemoveContainer" containerID="9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.809697 4849 scope.go:117] "RemoveContainer" containerID="b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150" Mar 20 13:53:23 crc kubenswrapper[4849]: E0320 13:53:23.810185 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150\": container with ID starting with b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150 not found: ID does not exist" containerID="b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.810229 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150"} err="failed to get container status \"b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150\": rpc error: code = NotFound desc = could not find container \"b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150\": container with ID starting with b7263ab807f03b43008122303704596487eb4c83e58b4cbb71c2be4ad509d150 not found: ID does not exist" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.810258 4849 scope.go:117] "RemoveContainer" containerID="9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f" Mar 20 13:53:23 crc kubenswrapper[4849]: E0320 13:53:23.810600 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f\": container with ID starting with 9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f not found: ID does not exist" containerID="9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f" Mar 20 13:53:23 crc kubenswrapper[4849]: I0320 13:53:23.810641 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f"} err="failed to get container status \"9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f\": rpc error: code = NotFound desc = could not find container \"9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f\": container with ID starting with 9ff2cf949dd1914b91753a1b62ad8fb7c130b711486902a840bfc042825b137f not found: ID does not exist" Mar 20 13:53:25 crc kubenswrapper[4849]: I0320 13:53:25.045162 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e6493a-981c-4e06-ade3-77a34e3da785" path="/var/lib/kubelet/pods/d7e6493a-981c-4e06-ade3-77a34e3da785/volumes" Mar 20 13:53:28 crc kubenswrapper[4849]: I0320 13:53:28.037760 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k72fr"] Mar 20 13:53:28 crc kubenswrapper[4849]: I0320 13:53:28.045675 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k72fr"] Mar 20 13:53:29 crc kubenswrapper[4849]: I0320 13:53:29.045743 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef32d779-a195-46ae-9112-3ecdbfe73a1e" path="/var/lib/kubelet/pods/ef32d779-a195-46ae-9112-3ecdbfe73a1e/volumes" Mar 20 13:53:33 crc kubenswrapper[4849]: I0320 13:53:33.037156 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:53:33 crc kubenswrapper[4849]: E0320 13:53:33.037965 4849 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:53:34 crc kubenswrapper[4849]: I0320 13:53:34.032511 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-66s8p"] Mar 20 13:53:34 crc kubenswrapper[4849]: I0320 13:53:34.039865 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-66s8p"] Mar 20 13:53:35 crc kubenswrapper[4849]: I0320 13:53:35.050970 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4baaa4a5-7434-40f0-bfee-185b7fc4fafb" path="/var/lib/kubelet/pods/4baaa4a5-7434-40f0-bfee-185b7fc4fafb/volumes" Mar 20 13:53:43 crc kubenswrapper[4849]: I0320 13:53:43.536977 4849 scope.go:117] "RemoveContainer" containerID="ff1e4d29f3ee14059efcbb76cce1d29efadb9d830a66c69dd1051995ea2dbf8d" Mar 20 13:53:43 crc kubenswrapper[4849]: I0320 13:53:43.567599 4849 scope.go:117] "RemoveContainer" containerID="47c60bc3417b34de6c722ab64bc3d05ee39f4c0262b7596e940a2d97a48a9026" Mar 20 13:53:43 crc kubenswrapper[4849]: I0320 13:53:43.662870 4849 scope.go:117] "RemoveContainer" containerID="fb4ffd092b417824dce96e2e08c24ef2c022bf8bb0d70888a05443d8404cdfe1" Mar 20 13:53:43 crc kubenswrapper[4849]: I0320 13:53:43.690365 4849 scope.go:117] "RemoveContainer" containerID="5cbd99f599b63b50294589943663a22165a805730ac589805dd1bcd01b8a9ce9" Mar 20 13:53:43 crc kubenswrapper[4849]: I0320 13:53:43.754427 4849 scope.go:117] "RemoveContainer" containerID="90bd408c86ca7b27de8160bc1ceee85217671e011c05722d93b404ae39bda19f" Mar 20 13:53:43 crc kubenswrapper[4849]: I0320 13:53:43.777584 4849 scope.go:117] "RemoveContainer" 
containerID="76061c666572f1825eb1147c473b1cd80697afb3b14af268eb9aa157da8de120" Mar 20 13:53:43 crc kubenswrapper[4849]: I0320 13:53:43.821349 4849 scope.go:117] "RemoveContainer" containerID="140f4f346af43508331dc6f7f986eb57d97ee9144909d9e6a822159124442cfb" Mar 20 13:53:43 crc kubenswrapper[4849]: I0320 13:53:43.843958 4849 scope.go:117] "RemoveContainer" containerID="1a88a38842acb2f5623a2178083ec136d1bc6ee32eefbf4ffeef8c6c3f024a07" Mar 20 13:53:48 crc kubenswrapper[4849]: I0320 13:53:48.036764 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:53:48 crc kubenswrapper[4849]: E0320 13:53:48.037457 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:53:53 crc kubenswrapper[4849]: I0320 13:53:53.050694 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2589-account-create-update-q8rmz"] Mar 20 13:53:53 crc kubenswrapper[4849]: I0320 13:53:53.057448 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gqdxs"] Mar 20 13:53:53 crc kubenswrapper[4849]: I0320 13:53:53.068415 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bxj2z"] Mar 20 13:53:53 crc kubenswrapper[4849]: I0320 13:53:53.076550 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gqdxs"] Mar 20 13:53:53 crc kubenswrapper[4849]: I0320 13:53:53.083921 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2589-account-create-update-q8rmz"] Mar 20 13:53:53 crc kubenswrapper[4849]: I0320 13:53:53.091400 
4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bxj2z"] Mar 20 13:53:55 crc kubenswrapper[4849]: I0320 13:53:55.050029 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5fb05e-2f40-432f-acf5-068f32e62698" path="/var/lib/kubelet/pods/5b5fb05e-2f40-432f-acf5-068f32e62698/volumes" Mar 20 13:53:55 crc kubenswrapper[4849]: I0320 13:53:55.051444 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6155d2c7-33f5-4bbb-b6a0-a378848a08e5" path="/var/lib/kubelet/pods/6155d2c7-33f5-4bbb-b6a0-a378848a08e5/volumes" Mar 20 13:53:55 crc kubenswrapper[4849]: I0320 13:53:55.052582 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66983f57-dfbe-4c47-90f6-9eef82ebd9a1" path="/var/lib/kubelet/pods/66983f57-dfbe-4c47-90f6-9eef82ebd9a1/volumes" Mar 20 13:53:57 crc kubenswrapper[4849]: I0320 13:53:57.035570 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-jdcns"] Mar 20 13:53:57 crc kubenswrapper[4849]: I0320 13:53:57.050217 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1a7e-account-create-update-lqd64"] Mar 20 13:53:57 crc kubenswrapper[4849]: I0320 13:53:57.058495 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-jdcns"] Mar 20 13:53:57 crc kubenswrapper[4849]: I0320 13:53:57.065659 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1a7e-account-create-update-lqd64"] Mar 20 13:53:57 crc kubenswrapper[4849]: I0320 13:53:57.072522 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e6b6-account-create-update-bxjkb"] Mar 20 13:53:57 crc kubenswrapper[4849]: I0320 13:53:57.080259 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e6b6-account-create-update-bxjkb"] Mar 20 13:53:59 crc kubenswrapper[4849]: I0320 13:53:59.046627 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c" path="/var/lib/kubelet/pods/01152ef5-8fb0-44bd-aa3d-6a1e8a4e2f1c/volumes" Mar 20 13:53:59 crc kubenswrapper[4849]: I0320 13:53:59.047218 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05bc515e-52ae-4e93-b967-a458d135ae12" path="/var/lib/kubelet/pods/05bc515e-52ae-4e93-b967-a458d135ae12/volumes" Mar 20 13:53:59 crc kubenswrapper[4849]: I0320 13:53:59.047697 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d1422a9-d5f8-4349-8513-0bd372fa8500" path="/var/lib/kubelet/pods/3d1422a9-d5f8-4349-8513-0bd372fa8500/volumes" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.035476 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:54:00 crc kubenswrapper[4849]: E0320 13:54:00.035871 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.139921 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566914-2p7ff"] Mar 20 13:54:00 crc kubenswrapper[4849]: E0320 13:54:00.140350 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerName="extract-utilities" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.140364 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerName="extract-utilities" Mar 20 13:54:00 crc kubenswrapper[4849]: E0320 13:54:00.140378 4849 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerName="gather" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.140383 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerName="gather" Mar 20 13:54:00 crc kubenswrapper[4849]: E0320 13:54:00.140395 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerName="registry-server" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.140401 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerName="registry-server" Mar 20 13:54:00 crc kubenswrapper[4849]: E0320 13:54:00.140415 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerName="extract-content" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.140421 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" containerName="extract-content" Mar 20 13:54:00 crc kubenswrapper[4849]: E0320 13:54:00.140436 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerName="copy" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.140441 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerName="copy" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.140627 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerName="gather" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.140657 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e6493a-981c-4e06-ade3-77a34e3da785" containerName="copy" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.140680 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="94695cd3-0a50-4dd3-b1c0-6c829cd10790" 
containerName="registry-server" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.143757 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-2p7ff" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.146464 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.146635 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.150220 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.155235 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-2p7ff"] Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.245079 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4jp\" (UniqueName: \"kubernetes.io/projected/f23489a5-0d39-4be3-b776-3e10e7acc7d1-kube-api-access-km4jp\") pod \"auto-csr-approver-29566914-2p7ff\" (UID: \"f23489a5-0d39-4be3-b776-3e10e7acc7d1\") " pod="openshift-infra/auto-csr-approver-29566914-2p7ff" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.346036 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4jp\" (UniqueName: \"kubernetes.io/projected/f23489a5-0d39-4be3-b776-3e10e7acc7d1-kube-api-access-km4jp\") pod \"auto-csr-approver-29566914-2p7ff\" (UID: \"f23489a5-0d39-4be3-b776-3e10e7acc7d1\") " pod="openshift-infra/auto-csr-approver-29566914-2p7ff" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.363977 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4jp\" (UniqueName: 
\"kubernetes.io/projected/f23489a5-0d39-4be3-b776-3e10e7acc7d1-kube-api-access-km4jp\") pod \"auto-csr-approver-29566914-2p7ff\" (UID: \"f23489a5-0d39-4be3-b776-3e10e7acc7d1\") " pod="openshift-infra/auto-csr-approver-29566914-2p7ff" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.480327 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-2p7ff" Mar 20 13:54:00 crc kubenswrapper[4849]: I0320 13:54:00.930534 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-2p7ff"] Mar 20 13:54:01 crc kubenswrapper[4849]: I0320 13:54:01.032356 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xvqnp"] Mar 20 13:54:01 crc kubenswrapper[4849]: I0320 13:54:01.056650 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xvqnp"] Mar 20 13:54:01 crc kubenswrapper[4849]: I0320 13:54:01.056697 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-2p7ff" event={"ID":"f23489a5-0d39-4be3-b776-3e10e7acc7d1","Type":"ContainerStarted","Data":"9d1c98214d8e2f9a224f993d30e023a1db977d11811aec27465fa3da8e15fc29"} Mar 20 13:54:03 crc kubenswrapper[4849]: I0320 13:54:03.051459 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6129a249-c10a-4299-90c6-147c58b4926e" path="/var/lib/kubelet/pods/6129a249-c10a-4299-90c6-147c58b4926e/volumes" Mar 20 13:54:03 crc kubenswrapper[4849]: I0320 13:54:03.074243 4849 generic.go:334] "Generic (PLEG): container finished" podID="f23489a5-0d39-4be3-b776-3e10e7acc7d1" containerID="5d7cbf6f4c35a2d1a337c528e20f3ec0530939350aa707337ed79bed85ef19a6" exitCode=0 Mar 20 13:54:03 crc kubenswrapper[4849]: I0320 13:54:03.074335 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-2p7ff" 
event={"ID":"f23489a5-0d39-4be3-b776-3e10e7acc7d1","Type":"ContainerDied","Data":"5d7cbf6f4c35a2d1a337c528e20f3ec0530939350aa707337ed79bed85ef19a6"} Mar 20 13:54:04 crc kubenswrapper[4849]: I0320 13:54:04.460784 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-2p7ff" Mar 20 13:54:04 crc kubenswrapper[4849]: I0320 13:54:04.531690 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km4jp\" (UniqueName: \"kubernetes.io/projected/f23489a5-0d39-4be3-b776-3e10e7acc7d1-kube-api-access-km4jp\") pod \"f23489a5-0d39-4be3-b776-3e10e7acc7d1\" (UID: \"f23489a5-0d39-4be3-b776-3e10e7acc7d1\") " Mar 20 13:54:04 crc kubenswrapper[4849]: I0320 13:54:04.538010 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f23489a5-0d39-4be3-b776-3e10e7acc7d1-kube-api-access-km4jp" (OuterVolumeSpecName: "kube-api-access-km4jp") pod "f23489a5-0d39-4be3-b776-3e10e7acc7d1" (UID: "f23489a5-0d39-4be3-b776-3e10e7acc7d1"). InnerVolumeSpecName "kube-api-access-km4jp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:04 crc kubenswrapper[4849]: I0320 13:54:04.634278 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km4jp\" (UniqueName: \"kubernetes.io/projected/f23489a5-0d39-4be3-b776-3e10e7acc7d1-kube-api-access-km4jp\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:05 crc kubenswrapper[4849]: I0320 13:54:05.098688 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-2p7ff" event={"ID":"f23489a5-0d39-4be3-b776-3e10e7acc7d1","Type":"ContainerDied","Data":"9d1c98214d8e2f9a224f993d30e023a1db977d11811aec27465fa3da8e15fc29"} Mar 20 13:54:05 crc kubenswrapper[4849]: I0320 13:54:05.098730 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d1c98214d8e2f9a224f993d30e023a1db977d11811aec27465fa3da8e15fc29" Mar 20 13:54:05 crc kubenswrapper[4849]: I0320 13:54:05.098736 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-2p7ff" Mar 20 13:54:05 crc kubenswrapper[4849]: I0320 13:54:05.525976 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-fp2m7"] Mar 20 13:54:05 crc kubenswrapper[4849]: I0320 13:54:05.533857 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-fp2m7"] Mar 20 13:54:07 crc kubenswrapper[4849]: I0320 13:54:07.046273 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6bf630-68ee-40f6-831b-feb110e2bc2e" path="/var/lib/kubelet/pods/3c6bf630-68ee-40f6-831b-feb110e2bc2e/volumes" Mar 20 13:54:12 crc kubenswrapper[4849]: I0320 13:54:12.036653 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:54:12 crc kubenswrapper[4849]: E0320 13:54:12.037397 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:54:26 crc kubenswrapper[4849]: I0320 13:54:26.036181 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:54:26 crc kubenswrapper[4849]: E0320 13:54:26.037077 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:54:31 crc kubenswrapper[4849]: I0320 13:54:31.050607 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-wh76b"] Mar 20 13:54:31 crc kubenswrapper[4849]: I0320 13:54:31.058528 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-wh76b"] Mar 20 13:54:33 crc kubenswrapper[4849]: I0320 13:54:33.048206 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a5cf24-7a8d-40f9-87cc-0b9b6533e520" path="/var/lib/kubelet/pods/c2a5cf24-7a8d-40f9-87cc-0b9b6533e520/volumes" Mar 20 13:54:38 crc kubenswrapper[4849]: I0320 13:54:38.035813 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:54:38 crc kubenswrapper[4849]: E0320 13:54:38.036495 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:54:39 crc kubenswrapper[4849]: I0320 13:54:39.027699 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-xbwzw"] Mar 20 13:54:39 crc kubenswrapper[4849]: I0320 13:54:39.051901 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-xbwzw"] Mar 20 13:54:41 crc kubenswrapper[4849]: I0320 13:54:41.032367 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6qksz"] Mar 20 13:54:41 crc kubenswrapper[4849]: I0320 13:54:41.052148 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39790e43-e227-4e13-8054-995e12255ec8" path="/var/lib/kubelet/pods/39790e43-e227-4e13-8054-995e12255ec8/volumes" Mar 20 13:54:41 crc kubenswrapper[4849]: I0320 13:54:41.052949 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6qksz"] Mar 20 13:54:43 crc kubenswrapper[4849]: I0320 13:54:43.055181 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f18d572-488f-4e4e-9596-3b99b5298123" path="/var/lib/kubelet/pods/6f18d572-488f-4e4e-9596-3b99b5298123/volumes" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.038277 4849 scope.go:117] "RemoveContainer" containerID="7ed8622f4b99236264791f2ed048fa77c650e00cce26a8711021fdbef54e5b37" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.087050 4849 scope.go:117] "RemoveContainer" containerID="c406957d2efda55ad10cbef7a42d1d202eeb9461998ac4439beb54986ace02f6" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.130804 4849 scope.go:117] "RemoveContainer" containerID="23d12cc0933c0ce1719047e9148b5861d1359d7aacd36f1966df954a69ecf56f" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 
13:54:44.209347 4849 scope.go:117] "RemoveContainer" containerID="7accee13eab298a882b99f170b6f179d6166c223602c4f9609540aad21013052" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.247610 4849 scope.go:117] "RemoveContainer" containerID="7f3cb2db57e1ff97b4333f399e9f9b2263113c1cda171a7408e89a733fef5fd3" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.287226 4849 scope.go:117] "RemoveContainer" containerID="49d2c53933c500e513c072ee38587bcca5460421ba255a31fb2a0a36d5b59b82" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.324246 4849 scope.go:117] "RemoveContainer" containerID="b724a256287ef2cf0c43f3ff39747afd41fe83ec47e27847686868b9aebdfe10" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.349159 4849 scope.go:117] "RemoveContainer" containerID="d3013a21b55879ce171f0042ee5d7803ef2c391a92af0feebb0c4c7ca517b993" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.377014 4849 scope.go:117] "RemoveContainer" containerID="dd1b63c949713675b1f02a553c25e128ba4e4f58d0e7c7479148e78cc5bf860b" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.397222 4849 scope.go:117] "RemoveContainer" containerID="2aceb57fe05e593b5cca1609dc37957848d7dfe43e49ce228d6c9593c262551c" Mar 20 13:54:44 crc kubenswrapper[4849]: I0320 13:54:44.419884 4849 scope.go:117] "RemoveContainer" containerID="80dd9311d2d3c9a0256f25fa30c71e71b906754fd19ca64fafe7cba22f93c606" Mar 20 13:54:53 crc kubenswrapper[4849]: I0320 13:54:53.036379 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:54:53 crc kubenswrapper[4849]: E0320 13:54:53.037100 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:54:53 crc kubenswrapper[4849]: I0320 13:54:53.064668 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gwq28"] Mar 20 13:54:53 crc kubenswrapper[4849]: I0320 13:54:53.067525 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gwq28"] Mar 20 13:54:55 crc kubenswrapper[4849]: I0320 13:54:55.047033 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9399c2-4755-4acd-8514-7d49cdd92f16" path="/var/lib/kubelet/pods/ee9399c2-4755-4acd-8514-7d49cdd92f16/volumes" Mar 20 13:54:57 crc kubenswrapper[4849]: I0320 13:54:57.024489 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jk575"] Mar 20 13:54:57 crc kubenswrapper[4849]: I0320 13:54:57.032434 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jk575"] Mar 20 13:54:57 crc kubenswrapper[4849]: I0320 13:54:57.049404 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701dfbaa-ecac-4290-9402-90c866ccd108" path="/var/lib/kubelet/pods/701dfbaa-ecac-4290-9402-90c866ccd108/volumes" Mar 20 13:55:04 crc kubenswrapper[4849]: I0320 13:55:04.036265 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:55:04 crc kubenswrapper[4849]: E0320 13:55:04.036949 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:55:15 crc kubenswrapper[4849]: I0320 13:55:15.036204 4849 scope.go:117] 
"RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:55:15 crc kubenswrapper[4849]: E0320 13:55:15.038071 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.250286 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-28x88"] Mar 20 13:55:27 crc kubenswrapper[4849]: E0320 13:55:27.251361 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f23489a5-0d39-4be3-b776-3e10e7acc7d1" containerName="oc" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.251374 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="f23489a5-0d39-4be3-b776-3e10e7acc7d1" containerName="oc" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.251626 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="f23489a5-0d39-4be3-b776-3e10e7acc7d1" containerName="oc" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.253301 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.259597 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-28x88"] Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.404676 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpzx\" (UniqueName: \"kubernetes.io/projected/18e5eaad-1021-4efb-bff6-042b1bfb64b9-kube-api-access-pvpzx\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.405172 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-catalog-content\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.405262 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-utilities\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.507169 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-catalog-content\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.507449 4849 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-utilities\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.507575 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpzx\" (UniqueName: \"kubernetes.io/projected/18e5eaad-1021-4efb-bff6-042b1bfb64b9-kube-api-access-pvpzx\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.507671 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-catalog-content\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.507728 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-utilities\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.539226 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpzx\" (UniqueName: \"kubernetes.io/projected/18e5eaad-1021-4efb-bff6-042b1bfb64b9-kube-api-access-pvpzx\") pod \"redhat-operators-28x88\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:27 crc kubenswrapper[4849]: I0320 13:55:27.576271 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:28 crc kubenswrapper[4849]: I0320 13:55:28.036062 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:55:28 crc kubenswrapper[4849]: E0320 13:55:28.037016 4849 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pzdl_openshift-machine-config-operator(9aefa038-8804-4eff-b0a9-3d6ce4a47a6a)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" podUID="9aefa038-8804-4eff-b0a9-3d6ce4a47a6a" Mar 20 13:55:28 crc kubenswrapper[4849]: I0320 13:55:28.218774 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-28x88"] Mar 20 13:55:28 crc kubenswrapper[4849]: I0320 13:55:28.897846 4849 generic.go:334] "Generic (PLEG): container finished" podID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerID="1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472" exitCode=0 Mar 20 13:55:28 crc kubenswrapper[4849]: I0320 13:55:28.897942 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28x88" event={"ID":"18e5eaad-1021-4efb-bff6-042b1bfb64b9","Type":"ContainerDied","Data":"1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472"} Mar 20 13:55:28 crc kubenswrapper[4849]: I0320 13:55:28.898110 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28x88" event={"ID":"18e5eaad-1021-4efb-bff6-042b1bfb64b9","Type":"ContainerStarted","Data":"5fdf4c78716c1b1006f9984dded256e1e28d0eb45ffde1c51323143b335e87e4"} Mar 20 13:55:28 crc kubenswrapper[4849]: I0320 13:55:28.900017 4849 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:55:30 crc 
kubenswrapper[4849]: I0320 13:55:30.915928 4849 generic.go:334] "Generic (PLEG): container finished" podID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerID="49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59" exitCode=0 Mar 20 13:55:30 crc kubenswrapper[4849]: I0320 13:55:30.915980 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28x88" event={"ID":"18e5eaad-1021-4efb-bff6-042b1bfb64b9","Type":"ContainerDied","Data":"49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59"} Mar 20 13:55:31 crc kubenswrapper[4849]: I0320 13:55:31.926959 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28x88" event={"ID":"18e5eaad-1021-4efb-bff6-042b1bfb64b9","Type":"ContainerStarted","Data":"b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b"} Mar 20 13:55:31 crc kubenswrapper[4849]: I0320 13:55:31.944709 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-28x88" podStartSLOduration=2.543104217 podStartE2EDuration="4.944687879s" podCreationTimestamp="2026-03-20 13:55:27 +0000 UTC" firstStartedPulling="2026-03-20 13:55:28.899797605 +0000 UTC m=+1878.577521000" lastFinishedPulling="2026-03-20 13:55:31.301381267 +0000 UTC m=+1880.979104662" observedRunningTime="2026-03-20 13:55:31.941764142 +0000 UTC m=+1881.619487537" watchObservedRunningTime="2026-03-20 13:55:31.944687879 +0000 UTC m=+1881.622411274" Mar 20 13:55:35 crc kubenswrapper[4849]: I0320 13:55:35.841466 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9mv8"] Mar 20 13:55:35 crc kubenswrapper[4849]: I0320 13:55:35.844108 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:35 crc kubenswrapper[4849]: I0320 13:55:35.898421 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9mv8"] Mar 20 13:55:35 crc kubenswrapper[4849]: I0320 13:55:35.982749 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-utilities\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:35 crc kubenswrapper[4849]: I0320 13:55:35.982854 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7x7k\" (UniqueName: \"kubernetes.io/projected/18e432a7-2e75-430d-92cd-718da222d369-kube-api-access-t7x7k\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:35 crc kubenswrapper[4849]: I0320 13:55:35.983152 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-catalog-content\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.084610 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-utilities\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.084677 4849 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t7x7k\" (UniqueName: \"kubernetes.io/projected/18e432a7-2e75-430d-92cd-718da222d369-kube-api-access-t7x7k\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.084777 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-catalog-content\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.085539 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-utilities\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.085558 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-catalog-content\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.113793 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7x7k\" (UniqueName: \"kubernetes.io/projected/18e432a7-2e75-430d-92cd-718da222d369-kube-api-access-t7x7k\") pod \"certified-operators-h9mv8\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.226716 4849 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.712955 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9mv8"] Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.974231 4849 generic.go:334] "Generic (PLEG): container finished" podID="18e432a7-2e75-430d-92cd-718da222d369" containerID="b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f" exitCode=0 Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.974302 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mv8" event={"ID":"18e432a7-2e75-430d-92cd-718da222d369","Type":"ContainerDied","Data":"b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f"} Mar 20 13:55:36 crc kubenswrapper[4849]: I0320 13:55:36.974520 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mv8" event={"ID":"18e432a7-2e75-430d-92cd-718da222d369","Type":"ContainerStarted","Data":"f19e762490513d67b1b57edca714c853df3d6dcb4f01d4d5b2a2583651cc481e"} Mar 20 13:55:37 crc kubenswrapper[4849]: I0320 13:55:37.584116 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:37 crc kubenswrapper[4849]: I0320 13:55:37.584422 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:37 crc kubenswrapper[4849]: I0320 13:55:37.984204 4849 generic.go:334] "Generic (PLEG): container finished" podID="18e432a7-2e75-430d-92cd-718da222d369" containerID="9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0" exitCode=0 Mar 20 13:55:37 crc kubenswrapper[4849]: I0320 13:55:37.984245 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mv8" 
event={"ID":"18e432a7-2e75-430d-92cd-718da222d369","Type":"ContainerDied","Data":"9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0"} Mar 20 13:55:38 crc kubenswrapper[4849]: I0320 13:55:38.638300 4849 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-28x88" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:38 crc kubenswrapper[4849]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:38 crc kubenswrapper[4849]: > Mar 20 13:55:38 crc kubenswrapper[4849]: I0320 13:55:38.998618 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mv8" event={"ID":"18e432a7-2e75-430d-92cd-718da222d369","Type":"ContainerStarted","Data":"547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500"} Mar 20 13:55:39 crc kubenswrapper[4849]: I0320 13:55:39.020879 4849 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h9mv8" podStartSLOduration=2.609539586 podStartE2EDuration="4.020861813s" podCreationTimestamp="2026-03-20 13:55:35 +0000 UTC" firstStartedPulling="2026-03-20 13:55:36.975510689 +0000 UTC m=+1886.653234084" lastFinishedPulling="2026-03-20 13:55:38.386832916 +0000 UTC m=+1888.064556311" observedRunningTime="2026-03-20 13:55:39.019152357 +0000 UTC m=+1888.696875752" watchObservedRunningTime="2026-03-20 13:55:39.020861813 +0000 UTC m=+1888.698585208" Mar 20 13:55:40 crc kubenswrapper[4849]: I0320 13:55:40.049188 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-2kmcz"] Mar 20 13:55:40 crc kubenswrapper[4849]: I0320 13:55:40.062220 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hgx5j"] Mar 20 13:55:40 crc kubenswrapper[4849]: I0320 13:55:40.097550 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-db-create-2kmcz"] Mar 20 13:55:40 crc kubenswrapper[4849]: I0320 13:55:40.105597 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-653c-account-create-update-7p7tb"] Mar 20 13:55:40 crc kubenswrapper[4849]: I0320 13:55:40.112601 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hgx5j"] Mar 20 13:55:40 crc kubenswrapper[4849]: I0320 13:55:40.118951 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-653c-account-create-update-7p7tb"] Mar 20 13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.031381 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c417-account-create-update-x9vgq"] Mar 20 13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.051281 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5336158a-f129-45cf-a73c-5e0733002023" path="/var/lib/kubelet/pods/5336158a-f129-45cf-a73c-5e0733002023/volumes" Mar 20 13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.052110 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d86487-6a56-429a-a4af-afc82ed6a843" path="/var/lib/kubelet/pods/b1d86487-6a56-429a-a4af-afc82ed6a843/volumes" Mar 20 13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.053004 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc762765-75e8-42df-bfd0-86cbad8172b3" path="/var/lib/kubelet/pods/fc762765-75e8-42df-bfd0-86cbad8172b3/volumes" Mar 20 13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.053727 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4lmj8"] Mar 20 13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.053767 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c417-account-create-update-x9vgq"] Mar 20 13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.061850 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4lmj8"] Mar 20 
13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.071587 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5bc4-account-create-update-dggjx"] Mar 20 13:55:41 crc kubenswrapper[4849]: I0320 13:55:41.082793 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5bc4-account-create-update-dggjx"] Mar 20 13:55:43 crc kubenswrapper[4849]: I0320 13:55:43.036246 4849 scope.go:117] "RemoveContainer" containerID="068a52dbe80d9760137e54053133e62d37fd5fbd6bc3511497980af2ee6536a0" Mar 20 13:55:43 crc kubenswrapper[4849]: I0320 13:55:43.046306 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fafae6e-b99e-4561-8caa-84a392b5e463" path="/var/lib/kubelet/pods/5fafae6e-b99e-4561-8caa-84a392b5e463/volumes" Mar 20 13:55:43 crc kubenswrapper[4849]: I0320 13:55:43.047024 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84081e43-a4b1-4462-9b31-21d5d443d016" path="/var/lib/kubelet/pods/84081e43-a4b1-4462-9b31-21d5d443d016/volumes" Mar 20 13:55:43 crc kubenswrapper[4849]: I0320 13:55:43.047718 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6fe63e8-731d-4c04-9679-25635974e8ce" path="/var/lib/kubelet/pods/d6fe63e8-731d-4c04-9679-25635974e8ce/volumes" Mar 20 13:55:44 crc kubenswrapper[4849]: I0320 13:55:44.039297 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pzdl" event={"ID":"9aefa038-8804-4eff-b0a9-3d6ce4a47a6a","Type":"ContainerStarted","Data":"24d870ae8f1f61f71f779ba924720eaca60168158d2416afb4e0ff7ea060ae97"} Mar 20 13:55:44 crc kubenswrapper[4849]: I0320 13:55:44.605665 4849 scope.go:117] "RemoveContainer" containerID="9f9c7ef807bdbe8189489ac4538fbbe1b937cc89119c08518ef8d37fd4ed385a" Mar 20 13:55:44 crc kubenswrapper[4849]: I0320 13:55:44.636556 4849 scope.go:117] "RemoveContainer" containerID="98cb3e5a09830707adebabf8d1ec2cb833590015c9eba996562647114642f205" Mar 20 13:55:44 crc 
kubenswrapper[4849]: I0320 13:55:44.669260 4849 scope.go:117] "RemoveContainer" containerID="242c3caed3546905629881d9373bef7632d69447a8464addbbeb172f167eef6a" Mar 20 13:55:44 crc kubenswrapper[4849]: I0320 13:55:44.718013 4849 scope.go:117] "RemoveContainer" containerID="f0f839010b6717e6ff88a8ac36737355463b9967cfada5d839c52e8f21a81747" Mar 20 13:55:44 crc kubenswrapper[4849]: I0320 13:55:44.784001 4849 scope.go:117] "RemoveContainer" containerID="5f5e7b64fb88740552ddcefe1d16d6df85bd2a42ee1b612805991a0304c39adf" Mar 20 13:55:44 crc kubenswrapper[4849]: I0320 13:55:44.819688 4849 scope.go:117] "RemoveContainer" containerID="ff1b1f41e9211f3980105e4e2ceb94ed7dbd5707e99659148e01c814de2c1342" Mar 20 13:55:44 crc kubenswrapper[4849]: I0320 13:55:44.873696 4849 scope.go:117] "RemoveContainer" containerID="2d08047a2ee6678fdad94ab7e5aa89afb84af0849d91594a796ef4ec95434b50" Mar 20 13:55:44 crc kubenswrapper[4849]: I0320 13:55:44.888844 4849 scope.go:117] "RemoveContainer" containerID="ee8847e70eb8c7872f30a5b82c3d3210d7e9f84b912ec52e1657c62532e3b391" Mar 20 13:55:46 crc kubenswrapper[4849]: I0320 13:55:46.227235 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:46 crc kubenswrapper[4849]: I0320 13:55:46.229141 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:46 crc kubenswrapper[4849]: I0320 13:55:46.278161 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:47 crc kubenswrapper[4849]: I0320 13:55:47.110519 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:47 crc kubenswrapper[4849]: I0320 13:55:47.158395 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9mv8"] Mar 20 
13:55:47 crc kubenswrapper[4849]: I0320 13:55:47.619257 4849 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:47 crc kubenswrapper[4849]: I0320 13:55:47.678713 4849 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.083190 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9mv8" podUID="18e432a7-2e75-430d-92cd-718da222d369" containerName="registry-server" containerID="cri-o://547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500" gracePeriod=2 Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.111017 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-28x88"] Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.112707 4849 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-28x88" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerName="registry-server" containerID="cri-o://b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b" gracePeriod=2 Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.583516 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.589192 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.654685 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvpzx\" (UniqueName: \"kubernetes.io/projected/18e5eaad-1021-4efb-bff6-042b1bfb64b9-kube-api-access-pvpzx\") pod \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.654750 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-utilities\") pod \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.654791 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-catalog-content\") pod \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\" (UID: \"18e5eaad-1021-4efb-bff6-042b1bfb64b9\") " Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.654841 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7x7k\" (UniqueName: \"kubernetes.io/projected/18e432a7-2e75-430d-92cd-718da222d369-kube-api-access-t7x7k\") pod \"18e432a7-2e75-430d-92cd-718da222d369\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.654902 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-catalog-content\") pod \"18e432a7-2e75-430d-92cd-718da222d369\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.654993 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-utilities\") pod \"18e432a7-2e75-430d-92cd-718da222d369\" (UID: \"18e432a7-2e75-430d-92cd-718da222d369\") " Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.656036 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-utilities" (OuterVolumeSpecName: "utilities") pod "18e432a7-2e75-430d-92cd-718da222d369" (UID: "18e432a7-2e75-430d-92cd-718da222d369"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.657923 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-utilities" (OuterVolumeSpecName: "utilities") pod "18e5eaad-1021-4efb-bff6-042b1bfb64b9" (UID: "18e5eaad-1021-4efb-bff6-042b1bfb64b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.667221 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e432a7-2e75-430d-92cd-718da222d369-kube-api-access-t7x7k" (OuterVolumeSpecName: "kube-api-access-t7x7k") pod "18e432a7-2e75-430d-92cd-718da222d369" (UID: "18e432a7-2e75-430d-92cd-718da222d369"). InnerVolumeSpecName "kube-api-access-t7x7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.688766 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e5eaad-1021-4efb-bff6-042b1bfb64b9-kube-api-access-pvpzx" (OuterVolumeSpecName: "kube-api-access-pvpzx") pod "18e5eaad-1021-4efb-bff6-042b1bfb64b9" (UID: "18e5eaad-1021-4efb-bff6-042b1bfb64b9"). InnerVolumeSpecName "kube-api-access-pvpzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.734243 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18e432a7-2e75-430d-92cd-718da222d369" (UID: "18e432a7-2e75-430d-92cd-718da222d369"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.757247 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvpzx\" (UniqueName: \"kubernetes.io/projected/18e5eaad-1021-4efb-bff6-042b1bfb64b9-kube-api-access-pvpzx\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.757285 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.757296 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7x7k\" (UniqueName: \"kubernetes.io/projected/18e432a7-2e75-430d-92cd-718da222d369-kube-api-access-t7x7k\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.757305 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.757313 4849 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18e432a7-2e75-430d-92cd-718da222d369-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.818915 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18e5eaad-1021-4efb-bff6-042b1bfb64b9" (UID: "18e5eaad-1021-4efb-bff6-042b1bfb64b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:55:49 crc kubenswrapper[4849]: I0320 13:55:49.858910 4849 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18e5eaad-1021-4efb-bff6-042b1bfb64b9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.092444 4849 generic.go:334] "Generic (PLEG): container finished" podID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerID="b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b" exitCode=0 Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.092491 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28x88" event={"ID":"18e5eaad-1021-4efb-bff6-042b1bfb64b9","Type":"ContainerDied","Data":"b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b"} Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.092528 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-28x88" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.092780 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-28x88" event={"ID":"18e5eaad-1021-4efb-bff6-042b1bfb64b9","Type":"ContainerDied","Data":"5fdf4c78716c1b1006f9984dded256e1e28d0eb45ffde1c51323143b335e87e4"} Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.092846 4849 scope.go:117] "RemoveContainer" containerID="b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.098568 4849 generic.go:334] "Generic (PLEG): container finished" podID="18e432a7-2e75-430d-92cd-718da222d369" containerID="547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500" exitCode=0 Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.098602 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mv8" event={"ID":"18e432a7-2e75-430d-92cd-718da222d369","Type":"ContainerDied","Data":"547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500"} Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.098626 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9mv8" event={"ID":"18e432a7-2e75-430d-92cd-718da222d369","Type":"ContainerDied","Data":"f19e762490513d67b1b57edca714c853df3d6dcb4f01d4d5b2a2583651cc481e"} Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.098679 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9mv8" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.112519 4849 scope.go:117] "RemoveContainer" containerID="49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.131530 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-28x88"] Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.141995 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-28x88"] Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.149597 4849 scope.go:117] "RemoveContainer" containerID="1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.150765 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9mv8"] Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.160852 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9mv8"] Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.177044 4849 scope.go:117] "RemoveContainer" containerID="b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b" Mar 20 13:55:50 crc kubenswrapper[4849]: E0320 13:55:50.177669 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b\": container with ID starting with b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b not found: ID does not exist" containerID="b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.177707 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b"} 
err="failed to get container status \"b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b\": rpc error: code = NotFound desc = could not find container \"b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b\": container with ID starting with b3b44a8842e2bf849d50465e2d63a1480a222f5d31106455066648981dae7c8b not found: ID does not exist" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.177732 4849 scope.go:117] "RemoveContainer" containerID="49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59" Mar 20 13:55:50 crc kubenswrapper[4849]: E0320 13:55:50.178148 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59\": container with ID starting with 49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59 not found: ID does not exist" containerID="49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.178201 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59"} err="failed to get container status \"49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59\": rpc error: code = NotFound desc = could not find container \"49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59\": container with ID starting with 49b2369d3d7d5e24b3e0e1b42b7bbbb5382e77466c96b0b7974ae44bf6e32c59 not found: ID does not exist" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.178233 4849 scope.go:117] "RemoveContainer" containerID="1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472" Mar 20 13:55:50 crc kubenswrapper[4849]: E0320 13:55:50.178650 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472\": container with ID starting with 1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472 not found: ID does not exist" containerID="1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.178680 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472"} err="failed to get container status \"1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472\": rpc error: code = NotFound desc = could not find container \"1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472\": container with ID starting with 1ac54e2f79ac6c47e535a8ce931f9cac8f0921fe800e205385e739d3b6669472 not found: ID does not exist" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.178695 4849 scope.go:117] "RemoveContainer" containerID="547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.195163 4849 scope.go:117] "RemoveContainer" containerID="9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.211855 4849 scope.go:117] "RemoveContainer" containerID="b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.227189 4849 scope.go:117] "RemoveContainer" containerID="547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500" Mar 20 13:55:50 crc kubenswrapper[4849]: E0320 13:55:50.227556 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500\": container with ID starting with 547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500 not found: ID does not exist" 
containerID="547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.227595 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500"} err="failed to get container status \"547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500\": rpc error: code = NotFound desc = could not find container \"547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500\": container with ID starting with 547b9bc666cfd9bc6434c35529c7c11a65986e9e2c52679b9bbc2e8a519ea500 not found: ID does not exist" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.227619 4849 scope.go:117] "RemoveContainer" containerID="9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0" Mar 20 13:55:50 crc kubenswrapper[4849]: E0320 13:55:50.228215 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0\": container with ID starting with 9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0 not found: ID does not exist" containerID="9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.228252 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0"} err="failed to get container status \"9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0\": rpc error: code = NotFound desc = could not find container \"9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0\": container with ID starting with 9d29efd9a8be9ed75a59b21a6a7985d0c400a829fdcab16f4cd15f29f7a2b7f0 not found: ID does not exist" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.228279 4849 scope.go:117] 
"RemoveContainer" containerID="b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f" Mar 20 13:55:50 crc kubenswrapper[4849]: E0320 13:55:50.228642 4849 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f\": container with ID starting with b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f not found: ID does not exist" containerID="b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f" Mar 20 13:55:50 crc kubenswrapper[4849]: I0320 13:55:50.228667 4849 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f"} err="failed to get container status \"b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f\": rpc error: code = NotFound desc = could not find container \"b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f\": container with ID starting with b0f5e9b1279cf22267fba02ca0f164e1b841eeb41b31ce720ecd79cd9a0fc10f not found: ID does not exist" Mar 20 13:55:51 crc kubenswrapper[4849]: I0320 13:55:51.045583 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e432a7-2e75-430d-92cd-718da222d369" path="/var/lib/kubelet/pods/18e432a7-2e75-430d-92cd-718da222d369/volumes" Mar 20 13:55:51 crc kubenswrapper[4849]: I0320 13:55:51.047114 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" path="/var/lib/kubelet/pods/18e5eaad-1021-4efb-bff6-042b1bfb64b9/volumes" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.145584 4849 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566916-gb9sh"] Mar 20 13:56:00 crc kubenswrapper[4849]: E0320 13:56:00.146320 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e432a7-2e75-430d-92cd-718da222d369" 
containerName="extract-content" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.146333 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e432a7-2e75-430d-92cd-718da222d369" containerName="extract-content" Mar 20 13:56:00 crc kubenswrapper[4849]: E0320 13:56:00.146350 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e432a7-2e75-430d-92cd-718da222d369" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.146356 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e432a7-2e75-430d-92cd-718da222d369" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4849]: E0320 13:56:00.146365 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e432a7-2e75-430d-92cd-718da222d369" containerName="extract-utilities" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.146371 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e432a7-2e75-430d-92cd-718da222d369" containerName="extract-utilities" Mar 20 13:56:00 crc kubenswrapper[4849]: E0320 13:56:00.146386 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerName="extract-utilities" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.146392 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerName="extract-utilities" Mar 20 13:56:00 crc kubenswrapper[4849]: E0320 13:56:00.146411 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerName="extract-content" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.146417 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerName="extract-content" Mar 20 13:56:00 crc kubenswrapper[4849]: E0320 13:56:00.146427 4849 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" 
containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.146433 4849 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.146599 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e432a7-2e75-430d-92cd-718da222d369" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.146614 4849 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e5eaad-1021-4efb-bff6-042b1bfb64b9" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.147158 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-gb9sh" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.149404 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.152616 4849 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.153132 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-gb9sh"] Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.156107 4849 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-x4fhr" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.284037 4849 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqlf\" (UniqueName: \"kubernetes.io/projected/5322630e-17ba-494d-af6a-e58195482453-kube-api-access-xlqlf\") pod \"auto-csr-approver-29566916-gb9sh\" (UID: \"5322630e-17ba-494d-af6a-e58195482453\") " pod="openshift-infra/auto-csr-approver-29566916-gb9sh" Mar 20 13:56:00 crc 
kubenswrapper[4849]: I0320 13:56:00.385462 4849 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqlf\" (UniqueName: \"kubernetes.io/projected/5322630e-17ba-494d-af6a-e58195482453-kube-api-access-xlqlf\") pod \"auto-csr-approver-29566916-gb9sh\" (UID: \"5322630e-17ba-494d-af6a-e58195482453\") " pod="openshift-infra/auto-csr-approver-29566916-gb9sh" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.407642 4849 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqlf\" (UniqueName: \"kubernetes.io/projected/5322630e-17ba-494d-af6a-e58195482453-kube-api-access-xlqlf\") pod \"auto-csr-approver-29566916-gb9sh\" (UID: \"5322630e-17ba-494d-af6a-e58195482453\") " pod="openshift-infra/auto-csr-approver-29566916-gb9sh" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.472264 4849 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-gb9sh" Mar 20 13:56:00 crc kubenswrapper[4849]: I0320 13:56:00.928105 4849 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-gb9sh"] Mar 20 13:56:01 crc kubenswrapper[4849]: I0320 13:56:01.193507 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-gb9sh" event={"ID":"5322630e-17ba-494d-af6a-e58195482453","Type":"ContainerStarted","Data":"b8aea55c1c8854c504dbf7d8b2a5dea8a8219727abcefcf5c9e3da47b754b989"} Mar 20 13:56:03 crc kubenswrapper[4849]: I0320 13:56:03.211729 4849 generic.go:334] "Generic (PLEG): container finished" podID="5322630e-17ba-494d-af6a-e58195482453" containerID="4242cf25910937d6192e603994ea2e0ca80694d735a7bc3867d7b609c1ae8830" exitCode=0 Mar 20 13:56:03 crc kubenswrapper[4849]: I0320 13:56:03.211867 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-gb9sh" 
event={"ID":"5322630e-17ba-494d-af6a-e58195482453","Type":"ContainerDied","Data":"4242cf25910937d6192e603994ea2e0ca80694d735a7bc3867d7b609c1ae8830"} Mar 20 13:56:04 crc kubenswrapper[4849]: I0320 13:56:04.042363 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bsbt2"] Mar 20 13:56:04 crc kubenswrapper[4849]: I0320 13:56:04.049980 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bsbt2"] Mar 20 13:56:04 crc kubenswrapper[4849]: I0320 13:56:04.620790 4849 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-gb9sh" Mar 20 13:56:04 crc kubenswrapper[4849]: I0320 13:56:04.657715 4849 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlqlf\" (UniqueName: \"kubernetes.io/projected/5322630e-17ba-494d-af6a-e58195482453-kube-api-access-xlqlf\") pod \"5322630e-17ba-494d-af6a-e58195482453\" (UID: \"5322630e-17ba-494d-af6a-e58195482453\") " Mar 20 13:56:04 crc kubenswrapper[4849]: I0320 13:56:04.664455 4849 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5322630e-17ba-494d-af6a-e58195482453-kube-api-access-xlqlf" (OuterVolumeSpecName: "kube-api-access-xlqlf") pod "5322630e-17ba-494d-af6a-e58195482453" (UID: "5322630e-17ba-494d-af6a-e58195482453"). InnerVolumeSpecName "kube-api-access-xlqlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:04 crc kubenswrapper[4849]: I0320 13:56:04.760264 4849 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlqlf\" (UniqueName: \"kubernetes.io/projected/5322630e-17ba-494d-af6a-e58195482453-kube-api-access-xlqlf\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:05 crc kubenswrapper[4849]: I0320 13:56:05.047199 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c100d127-fda4-4f86-89d7-64a19be3e8ea" path="/var/lib/kubelet/pods/c100d127-fda4-4f86-89d7-64a19be3e8ea/volumes" Mar 20 13:56:05 crc kubenswrapper[4849]: I0320 13:56:05.234047 4849 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-gb9sh" event={"ID":"5322630e-17ba-494d-af6a-e58195482453","Type":"ContainerDied","Data":"b8aea55c1c8854c504dbf7d8b2a5dea8a8219727abcefcf5c9e3da47b754b989"} Mar 20 13:56:05 crc kubenswrapper[4849]: I0320 13:56:05.234079 4849 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8aea55c1c8854c504dbf7d8b2a5dea8a8219727abcefcf5c9e3da47b754b989" Mar 20 13:56:05 crc kubenswrapper[4849]: I0320 13:56:05.234082 4849 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-gb9sh" Mar 20 13:56:05 crc kubenswrapper[4849]: I0320 13:56:05.671048 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-5f9s4"] Mar 20 13:56:05 crc kubenswrapper[4849]: I0320 13:56:05.683352 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-5f9s4"] Mar 20 13:56:07 crc kubenswrapper[4849]: I0320 13:56:07.046171 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dbd39df-02d8-4bc2-8953-9618afa3138d" path="/var/lib/kubelet/pods/7dbd39df-02d8-4bc2-8953-9618afa3138d/volumes" Mar 20 13:56:26 crc kubenswrapper[4849]: I0320 13:56:26.042085 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bxjmr"] Mar 20 13:56:26 crc kubenswrapper[4849]: I0320 13:56:26.052455 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bxjmr"] Mar 20 13:56:26 crc kubenswrapper[4849]: I0320 13:56:26.060102 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccqkc"] Mar 20 13:56:26 crc kubenswrapper[4849]: I0320 13:56:26.066995 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ccqkc"] Mar 20 13:56:27 crc kubenswrapper[4849]: I0320 13:56:27.047149 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9903d0-8cc8-4bce-99da-96d1e8657e2a" path="/var/lib/kubelet/pods/ce9903d0-8cc8-4bce-99da-96d1e8657e2a/volumes" Mar 20 13:56:27 crc kubenswrapper[4849]: I0320 13:56:27.048523 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfff4046-20be-4224-8bc7-0741b2fd01a7" path="/var/lib/kubelet/pods/cfff4046-20be-4224-8bc7-0741b2fd01a7/volumes" Mar 20 13:56:45 crc kubenswrapper[4849]: I0320 13:56:45.020704 4849 scope.go:117] "RemoveContainer" 
containerID="7ed75db7a040c41a3c2f0b158b085a98f26a5f20c673bec43eada26c5e893852" Mar 20 13:56:45 crc kubenswrapper[4849]: I0320 13:56:45.068514 4849 scope.go:117] "RemoveContainer" containerID="c6da98ff43e0f1a5e583a16af2d1aec0ab25b16b45b2ad6de2f667d368e756f3" Mar 20 13:56:45 crc kubenswrapper[4849]: I0320 13:56:45.114201 4849 scope.go:117] "RemoveContainer" containerID="b2e1de513c310954fd87fb1ed6adf147981f9125667ee41d366af6f0e226eacd" Mar 20 13:56:45 crc kubenswrapper[4849]: I0320 13:56:45.158686 4849 scope.go:117] "RemoveContainer" containerID="662bd500bcc5017ab0e41959fcd1d810385d49d194b351fc9a42280103fe4006" Mar 20 13:57:10 crc kubenswrapper[4849]: I0320 13:57:10.055631 4849 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-qj5cd"] Mar 20 13:57:10 crc kubenswrapper[4849]: I0320 13:57:10.069180 4849 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-qj5cd"] Mar 20 13:57:11 crc kubenswrapper[4849]: I0320 13:57:11.050279 4849 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a" path="/var/lib/kubelet/pods/e40d1b51-4a54-41f3-bc73-2e9c4e6dff1a/volumes"